Nov 23 15:00:51 np0005532763 kernel: Linux version 5.14.0-639.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Sat Nov 15 10:30:41 UTC 2025
Nov 23 15:00:51 np0005532763 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 23 15:00:51 np0005532763 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 23 15:00:51 np0005532763 kernel: BIOS-provided physical RAM map:
Nov 23 15:00:51 np0005532763 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 23 15:00:51 np0005532763 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 23 15:00:51 np0005532763 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 23 15:00:51 np0005532763 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 23 15:00:51 np0005532763 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 23 15:00:51 np0005532763 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 23 15:00:51 np0005532763 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 23 15:00:51 np0005532763 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 23 15:00:51 np0005532763 kernel: NX (Execute Disable) protection: active
Nov 23 15:00:51 np0005532763 kernel: APIC: Static calls initialized
Nov 23 15:00:51 np0005532763 kernel: SMBIOS 2.8 present.
Nov 23 15:00:51 np0005532763 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 23 15:00:51 np0005532763 kernel: Hypervisor detected: KVM
Nov 23 15:00:51 np0005532763 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 23 15:00:51 np0005532763 kernel: kvm-clock: using sched offset of 18496508186 cycles
Nov 23 15:00:51 np0005532763 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 23 15:00:51 np0005532763 kernel: tsc: Detected 2800.000 MHz processor
Nov 23 15:00:51 np0005532763 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 23 15:00:51 np0005532763 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 23 15:00:51 np0005532763 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 23 15:00:51 np0005532763 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 23 15:00:51 np0005532763 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 23 15:00:51 np0005532763 kernel: Using GB pages for direct mapping
Nov 23 15:00:51 np0005532763 kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 23 15:00:51 np0005532763 kernel: ACPI: Early table checksum verification disabled
Nov 23 15:00:51 np0005532763 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 23 15:00:51 np0005532763 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 23 15:00:51 np0005532763 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 23 15:00:51 np0005532763 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 23 15:00:51 np0005532763 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 23 15:00:51 np0005532763 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 23 15:00:51 np0005532763 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 23 15:00:51 np0005532763 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 23 15:00:51 np0005532763 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 23 15:00:51 np0005532763 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 23 15:00:51 np0005532763 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 23 15:00:51 np0005532763 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 23 15:00:51 np0005532763 kernel: No NUMA configuration found
Nov 23 15:00:51 np0005532763 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 23 15:00:51 np0005532763 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Nov 23 15:00:51 np0005532763 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 23 15:00:51 np0005532763 kernel: Zone ranges:
Nov 23 15:00:51 np0005532763 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 23 15:00:51 np0005532763 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 23 15:00:51 np0005532763 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 23 15:00:51 np0005532763 kernel:  Device   empty
Nov 23 15:00:51 np0005532763 kernel: Movable zone start for each node
Nov 23 15:00:51 np0005532763 kernel: Early memory node ranges
Nov 23 15:00:51 np0005532763 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 23 15:00:51 np0005532763 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 23 15:00:51 np0005532763 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 23 15:00:51 np0005532763 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 23 15:00:51 np0005532763 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 23 15:00:51 np0005532763 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 23 15:00:51 np0005532763 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 23 15:00:51 np0005532763 kernel: ACPI: PM-Timer IO Port: 0x608
Nov 23 15:00:51 np0005532763 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 23 15:00:51 np0005532763 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 23 15:00:51 np0005532763 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 23 15:00:51 np0005532763 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 23 15:00:51 np0005532763 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 23 15:00:51 np0005532763 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 23 15:00:51 np0005532763 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 23 15:00:51 np0005532763 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 23 15:00:51 np0005532763 kernel: TSC deadline timer available
Nov 23 15:00:51 np0005532763 kernel: CPU topo: Max. logical packages:   8
Nov 23 15:00:51 np0005532763 kernel: CPU topo: Max. logical dies:       8
Nov 23 15:00:51 np0005532763 kernel: CPU topo: Max. dies per package:   1
Nov 23 15:00:51 np0005532763 kernel: CPU topo: Max. threads per core:   1
Nov 23 15:00:51 np0005532763 kernel: CPU topo: Num. cores per package:     1
Nov 23 15:00:51 np0005532763 kernel: CPU topo: Num. threads per package:   1
Nov 23 15:00:51 np0005532763 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 23 15:00:51 np0005532763 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 23 15:00:51 np0005532763 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 23 15:00:51 np0005532763 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 23 15:00:51 np0005532763 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 23 15:00:51 np0005532763 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 23 15:00:51 np0005532763 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 23 15:00:51 np0005532763 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 23 15:00:51 np0005532763 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 23 15:00:51 np0005532763 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 23 15:00:51 np0005532763 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 23 15:00:51 np0005532763 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 23 15:00:51 np0005532763 kernel: Booting paravirtualized kernel on KVM
Nov 23 15:00:51 np0005532763 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 23 15:00:51 np0005532763 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 23 15:00:51 np0005532763 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 23 15:00:51 np0005532763 kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 23 15:00:51 np0005532763 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 23 15:00:51 np0005532763 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64", will be passed to user space.
Nov 23 15:00:51 np0005532763 kernel: random: crng init done
Nov 23 15:00:51 np0005532763 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 23 15:00:51 np0005532763 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 23 15:00:51 np0005532763 kernel: Fallback order for Node 0: 0 
Nov 23 15:00:51 np0005532763 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 23 15:00:51 np0005532763 kernel: Policy zone: Normal
Nov 23 15:00:51 np0005532763 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 23 15:00:51 np0005532763 kernel: software IO TLB: area num 8.
Nov 23 15:00:51 np0005532763 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 23 15:00:51 np0005532763 kernel: ftrace: allocating 49298 entries in 193 pages
Nov 23 15:00:51 np0005532763 kernel: ftrace: allocated 193 pages with 3 groups
Nov 23 15:00:51 np0005532763 kernel: Dynamic Preempt: voluntary
Nov 23 15:00:51 np0005532763 kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 23 15:00:51 np0005532763 kernel: rcu: 	RCU event tracing is enabled.
Nov 23 15:00:51 np0005532763 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 23 15:00:51 np0005532763 kernel: 	Trampoline variant of Tasks RCU enabled.
Nov 23 15:00:51 np0005532763 kernel: 	Rude variant of Tasks RCU enabled.
Nov 23 15:00:51 np0005532763 kernel: 	Tracing variant of Tasks RCU enabled.
Nov 23 15:00:51 np0005532763 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 23 15:00:51 np0005532763 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 23 15:00:51 np0005532763 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 23 15:00:51 np0005532763 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 23 15:00:51 np0005532763 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 23 15:00:51 np0005532763 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 23 15:00:51 np0005532763 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 23 15:00:51 np0005532763 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 23 15:00:51 np0005532763 kernel: Console: colour VGA+ 80x25
Nov 23 15:00:51 np0005532763 kernel: printk: console [ttyS0] enabled
Nov 23 15:00:51 np0005532763 kernel: ACPI: Core revision 20230331
Nov 23 15:00:51 np0005532763 kernel: APIC: Switch to symmetric I/O mode setup
Nov 23 15:00:51 np0005532763 kernel: x2apic enabled
Nov 23 15:00:51 np0005532763 kernel: APIC: Switched APIC routing to: physical x2apic
Nov 23 15:00:51 np0005532763 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 23 15:00:51 np0005532763 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Nov 23 15:00:51 np0005532763 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 23 15:00:51 np0005532763 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 23 15:00:51 np0005532763 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 23 15:00:51 np0005532763 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 23 15:00:51 np0005532763 kernel: Spectre V2 : Mitigation: Retpolines
Nov 23 15:00:51 np0005532763 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 23 15:00:51 np0005532763 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 23 15:00:51 np0005532763 kernel: RETBleed: Mitigation: untrained return thunk
Nov 23 15:00:51 np0005532763 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 23 15:00:51 np0005532763 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 23 15:00:51 np0005532763 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 23 15:00:51 np0005532763 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 23 15:00:51 np0005532763 kernel: x86/bugs: return thunk changed
Nov 23 15:00:51 np0005532763 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 23 15:00:51 np0005532763 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 23 15:00:51 np0005532763 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 23 15:00:51 np0005532763 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 23 15:00:51 np0005532763 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 23 15:00:51 np0005532763 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 23 15:00:51 np0005532763 kernel: Freeing SMP alternatives memory: 40K
Nov 23 15:00:51 np0005532763 kernel: pid_max: default: 32768 minimum: 301
Nov 23 15:00:51 np0005532763 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 23 15:00:51 np0005532763 kernel: landlock: Up and running.
Nov 23 15:00:51 np0005532763 kernel: Yama: becoming mindful.
Nov 23 15:00:51 np0005532763 kernel: SELinux:  Initializing.
Nov 23 15:00:51 np0005532763 kernel: LSM support for eBPF active
Nov 23 15:00:51 np0005532763 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 23 15:00:51 np0005532763 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 23 15:00:51 np0005532763 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 23 15:00:51 np0005532763 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 23 15:00:51 np0005532763 kernel: ... version:                0
Nov 23 15:00:51 np0005532763 kernel: ... bit width:              48
Nov 23 15:00:51 np0005532763 kernel: ... generic registers:      6
Nov 23 15:00:51 np0005532763 kernel: ... value mask:             0000ffffffffffff
Nov 23 15:00:51 np0005532763 kernel: ... max period:             00007fffffffffff
Nov 23 15:00:51 np0005532763 kernel: ... fixed-purpose events:   0
Nov 23 15:00:51 np0005532763 kernel: ... event mask:             000000000000003f
Nov 23 15:00:51 np0005532763 kernel: signal: max sigframe size: 1776
Nov 23 15:00:51 np0005532763 kernel: rcu: Hierarchical SRCU implementation.
Nov 23 15:00:51 np0005532763 kernel: rcu: 	Max phase no-delay instances is 400.
Nov 23 15:00:51 np0005532763 kernel: smp: Bringing up secondary CPUs ...
Nov 23 15:00:51 np0005532763 kernel: smpboot: x86: Booting SMP configuration:
Nov 23 15:00:51 np0005532763 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 23 15:00:51 np0005532763 kernel: smp: Brought up 1 node, 8 CPUs
Nov 23 15:00:51 np0005532763 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Nov 23 15:00:51 np0005532763 kernel: node 0 deferred pages initialised in 9ms
Nov 23 15:00:51 np0005532763 kernel: Memory: 7765704K/8388068K available (16384K kernel code, 5786K rwdata, 13900K rodata, 4188K init, 7176K bss, 616268K reserved, 0K cma-reserved)
Nov 23 15:00:51 np0005532763 kernel: devtmpfs: initialized
Nov 23 15:00:51 np0005532763 kernel: x86/mm: Memory block size: 128MB
Nov 23 15:00:51 np0005532763 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 23 15:00:51 np0005532763 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 23 15:00:51 np0005532763 kernel: pinctrl core: initialized pinctrl subsystem
Nov 23 15:00:51 np0005532763 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 23 15:00:51 np0005532763 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 23 15:00:51 np0005532763 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 23 15:00:51 np0005532763 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 23 15:00:51 np0005532763 kernel: audit: initializing netlink subsys (disabled)
Nov 23 15:00:51 np0005532763 kernel: audit: type=2000 audit(1763928049.710:1): state=initialized audit_enabled=0 res=1
Nov 23 15:00:51 np0005532763 kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 23 15:00:51 np0005532763 kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 23 15:00:51 np0005532763 kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 23 15:00:51 np0005532763 kernel: cpuidle: using governor menu
Nov 23 15:00:51 np0005532763 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 23 15:00:51 np0005532763 kernel: PCI: Using configuration type 1 for base access
Nov 23 15:00:51 np0005532763 kernel: PCI: Using configuration type 1 for extended access
Nov 23 15:00:51 np0005532763 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 23 15:00:51 np0005532763 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 23 15:00:51 np0005532763 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 23 15:00:51 np0005532763 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 23 15:00:51 np0005532763 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 23 15:00:51 np0005532763 kernel: Demotion targets for Node 0: null
Nov 23 15:00:51 np0005532763 kernel: cryptd: max_cpu_qlen set to 1000
Nov 23 15:00:51 np0005532763 kernel: ACPI: Added _OSI(Module Device)
Nov 23 15:00:51 np0005532763 kernel: ACPI: Added _OSI(Processor Device)
Nov 23 15:00:51 np0005532763 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 23 15:00:51 np0005532763 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 23 15:00:51 np0005532763 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 23 15:00:51 np0005532763 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 23 15:00:51 np0005532763 kernel: ACPI: Interpreter enabled
Nov 23 15:00:51 np0005532763 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 23 15:00:51 np0005532763 kernel: ACPI: Using IOAPIC for interrupt routing
Nov 23 15:00:51 np0005532763 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 23 15:00:51 np0005532763 kernel: PCI: Using E820 reservations for host bridge windows
Nov 23 15:00:51 np0005532763 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 23 15:00:51 np0005532763 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 23 15:00:51 np0005532763 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [3] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [4] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [5] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [6] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [7] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [8] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [9] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [10] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [11] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [12] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [13] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [14] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [15] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [16] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [17] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [18] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [19] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [20] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [21] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [22] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [23] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [24] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [25] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [26] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [27] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [28] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [29] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [30] registered
Nov 23 15:00:51 np0005532763 kernel: acpiphp: Slot [31] registered
Nov 23 15:00:51 np0005532763 kernel: PCI host bridge to bus 0000:00
Nov 23 15:00:51 np0005532763 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 23 15:00:51 np0005532763 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 23 15:00:51 np0005532763 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 23 15:00:51 np0005532763 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 23 15:00:51 np0005532763 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 23 15:00:51 np0005532763 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 23 15:00:51 np0005532763 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 23 15:00:51 np0005532763 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 23 15:00:51 np0005532763 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 23 15:00:51 np0005532763 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 23 15:00:51 np0005532763 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 23 15:00:51 np0005532763 kernel: iommu: Default domain type: Translated
Nov 23 15:00:51 np0005532763 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 23 15:00:51 np0005532763 kernel: SCSI subsystem initialized
Nov 23 15:00:51 np0005532763 kernel: ACPI: bus type USB registered
Nov 23 15:00:51 np0005532763 kernel: usbcore: registered new interface driver usbfs
Nov 23 15:00:51 np0005532763 kernel: usbcore: registered new interface driver hub
Nov 23 15:00:51 np0005532763 kernel: usbcore: registered new device driver usb
Nov 23 15:00:51 np0005532763 kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 23 15:00:51 np0005532763 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 23 15:00:51 np0005532763 kernel: PTP clock support registered
Nov 23 15:00:51 np0005532763 kernel: EDAC MC: Ver: 3.0.0
Nov 23 15:00:51 np0005532763 kernel: NetLabel: Initializing
Nov 23 15:00:51 np0005532763 kernel: NetLabel:  domain hash size = 128
Nov 23 15:00:51 np0005532763 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 23 15:00:51 np0005532763 kernel: NetLabel:  unlabeled traffic allowed by default
Nov 23 15:00:51 np0005532763 kernel: PCI: Using ACPI for IRQ routing
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 23 15:00:51 np0005532763 kernel: vgaarb: loaded
Nov 23 15:00:51 np0005532763 kernel: clocksource: Switched to clocksource kvm-clock
Nov 23 15:00:51 np0005532763 kernel: VFS: Disk quotas dquot_6.6.0
Nov 23 15:00:51 np0005532763 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 23 15:00:51 np0005532763 kernel: pnp: PnP ACPI init
Nov 23 15:00:51 np0005532763 kernel: pnp: PnP ACPI: found 5 devices
Nov 23 15:00:51 np0005532763 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 23 15:00:51 np0005532763 kernel: NET: Registered PF_INET protocol family
Nov 23 15:00:51 np0005532763 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 23 15:00:51 np0005532763 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 23 15:00:51 np0005532763 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 23 15:00:51 np0005532763 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 23 15:00:51 np0005532763 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 23 15:00:51 np0005532763 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 23 15:00:51 np0005532763 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 23 15:00:51 np0005532763 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 23 15:00:51 np0005532763 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 23 15:00:51 np0005532763 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 23 15:00:51 np0005532763 kernel: NET: Registered PF_XDP protocol family
Nov 23 15:00:51 np0005532763 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 23 15:00:51 np0005532763 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 23 15:00:51 np0005532763 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 23 15:00:51 np0005532763 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 23 15:00:51 np0005532763 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 23 15:00:51 np0005532763 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 23 15:00:51 np0005532763 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 110736 usecs
Nov 23 15:00:51 np0005532763 kernel: PCI: CLS 0 bytes, default 64
Nov 23 15:00:51 np0005532763 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 23 15:00:51 np0005532763 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 23 15:00:51 np0005532763 kernel: Trying to unpack rootfs image as initramfs...
Nov 23 15:00:51 np0005532763 kernel: ACPI: bus type thunderbolt registered
Nov 23 15:00:51 np0005532763 kernel: Initialise system trusted keyrings
Nov 23 15:00:51 np0005532763 kernel: Key type blacklist registered
Nov 23 15:00:51 np0005532763 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 23 15:00:51 np0005532763 kernel: zbud: loaded
Nov 23 15:00:51 np0005532763 kernel: integrity: Platform Keyring initialized
Nov 23 15:00:51 np0005532763 kernel: integrity: Machine keyring initialized
Nov 23 15:00:51 np0005532763 kernel: Freeing initrd memory: 85868K
Nov 23 15:00:51 np0005532763 kernel: NET: Registered PF_ALG protocol family
Nov 23 15:00:51 np0005532763 kernel: xor: automatically using best checksumming function   avx       
Nov 23 15:00:51 np0005532763 kernel: Key type asymmetric registered
Nov 23 15:00:51 np0005532763 kernel: Asymmetric key parser 'x509' registered
Nov 23 15:00:51 np0005532763 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 23 15:00:51 np0005532763 kernel: io scheduler mq-deadline registered
Nov 23 15:00:51 np0005532763 kernel: io scheduler kyber registered
Nov 23 15:00:51 np0005532763 kernel: io scheduler bfq registered
Nov 23 15:00:51 np0005532763 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 23 15:00:51 np0005532763 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 23 15:00:51 np0005532763 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 23 15:00:51 np0005532763 kernel: ACPI: button: Power Button [PWRF]
Nov 23 15:00:51 np0005532763 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 23 15:00:51 np0005532763 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 23 15:00:51 np0005532763 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 23 15:00:51 np0005532763 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 23 15:00:51 np0005532763 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 23 15:00:51 np0005532763 kernel: Non-volatile memory driver v1.3
Nov 23 15:00:51 np0005532763 kernel: rdac: device handler registered
Nov 23 15:00:51 np0005532763 kernel: hp_sw: device handler registered
Nov 23 15:00:51 np0005532763 kernel: emc: device handler registered
Nov 23 15:00:51 np0005532763 kernel: alua: device handler registered
Nov 23 15:00:51 np0005532763 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 23 15:00:51 np0005532763 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 23 15:00:51 np0005532763 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 23 15:00:51 np0005532763 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 23 15:00:51 np0005532763 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 23 15:00:51 np0005532763 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 23 15:00:51 np0005532763 kernel: usb usb1: Product: UHCI Host Controller
Nov 23 15:00:51 np0005532763 kernel: usb usb1: Manufacturer: Linux 5.14.0-639.el9.x86_64 uhci_hcd
Nov 23 15:00:51 np0005532763 kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 23 15:00:51 np0005532763 kernel: hub 1-0:1.0: USB hub found
Nov 23 15:00:51 np0005532763 kernel: hub 1-0:1.0: 2 ports detected
Nov 23 15:00:51 np0005532763 kernel: usbcore: registered new interface driver usbserial_generic
Nov 23 15:00:51 np0005532763 kernel: usbserial: USB Serial support registered for generic
Nov 23 15:00:51 np0005532763 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 23 15:00:51 np0005532763 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 23 15:00:51 np0005532763 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 23 15:00:51 np0005532763 kernel: mousedev: PS/2 mouse device common for all mice
Nov 23 15:00:51 np0005532763 kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 23 15:00:51 np0005532763 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 23 15:00:51 np0005532763 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 23 15:00:51 np0005532763 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 23 15:00:51 np0005532763 kernel: rtc_cmos 00:04: registered as rtc0
Nov 23 15:00:51 np0005532763 kernel: rtc_cmos 00:04: setting system clock to 2025-11-23T20:00:50 UTC (1763928050)
Nov 23 15:00:51 np0005532763 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 23 15:00:51 np0005532763 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 23 15:00:51 np0005532763 kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 23 15:00:51 np0005532763 kernel: usbcore: registered new interface driver usbhid
Nov 23 15:00:51 np0005532763 kernel: usbhid: USB HID core driver
Nov 23 15:00:51 np0005532763 kernel: drop_monitor: Initializing network drop monitor service
Nov 23 15:00:51 np0005532763 kernel: Initializing XFRM netlink socket
Nov 23 15:00:51 np0005532763 kernel: NET: Registered PF_INET6 protocol family
Nov 23 15:00:51 np0005532763 kernel: Segment Routing with IPv6
Nov 23 15:00:51 np0005532763 kernel: NET: Registered PF_PACKET protocol family
Nov 23 15:00:51 np0005532763 kernel: mpls_gso: MPLS GSO support
Nov 23 15:00:51 np0005532763 kernel: IPI shorthand broadcast: enabled
Nov 23 15:00:51 np0005532763 kernel: AVX2 version of gcm_enc/dec engaged.
Nov 23 15:00:51 np0005532763 kernel: AES CTR mode by8 optimization enabled
Nov 23 15:00:51 np0005532763 kernel: sched_clock: Marking stable (1463014880, 189974770)->(1815809840, -162820190)
Nov 23 15:00:51 np0005532763 kernel: registered taskstats version 1
Nov 23 15:00:51 np0005532763 kernel: Loading compiled-in X.509 certificates
Nov 23 15:00:51 np0005532763 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: f7751431c703da8a75244ce96aad68601cf1c188'
Nov 23 15:00:51 np0005532763 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 23 15:00:51 np0005532763 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 23 15:00:51 np0005532763 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 23 15:00:51 np0005532763 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 23 15:00:51 np0005532763 kernel: Demotion targets for Node 0: null
Nov 23 15:00:51 np0005532763 kernel: page_owner is disabled
Nov 23 15:00:51 np0005532763 kernel: Key type .fscrypt registered
Nov 23 15:00:51 np0005532763 kernel: Key type fscrypt-provisioning registered
Nov 23 15:00:51 np0005532763 kernel: Key type big_key registered
Nov 23 15:00:51 np0005532763 kernel: Key type encrypted registered
Nov 23 15:00:51 np0005532763 kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 23 15:00:51 np0005532763 kernel: Loading compiled-in module X.509 certificates
Nov 23 15:00:51 np0005532763 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: f7751431c703da8a75244ce96aad68601cf1c188'
Nov 23 15:00:51 np0005532763 kernel: ima: Allocated hash algorithm: sha256
Nov 23 15:00:51 np0005532763 kernel: ima: No architecture policies found
Nov 23 15:00:51 np0005532763 kernel: evm: Initialising EVM extended attributes:
Nov 23 15:00:51 np0005532763 kernel: evm: security.selinux
Nov 23 15:00:51 np0005532763 kernel: evm: security.SMACK64 (disabled)
Nov 23 15:00:51 np0005532763 kernel: evm: security.SMACK64EXEC (disabled)
Nov 23 15:00:51 np0005532763 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 23 15:00:51 np0005532763 kernel: evm: security.SMACK64MMAP (disabled)
Nov 23 15:00:51 np0005532763 kernel: evm: security.apparmor (disabled)
Nov 23 15:00:51 np0005532763 kernel: evm: security.ima
Nov 23 15:00:51 np0005532763 kernel: evm: security.capability
Nov 23 15:00:51 np0005532763 kernel: evm: HMAC attrs: 0x1
Nov 23 15:00:51 np0005532763 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 23 15:00:51 np0005532763 kernel: Running certificate verification RSA selftest
Nov 23 15:00:51 np0005532763 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 23 15:00:51 np0005532763 kernel: Running certificate verification ECDSA selftest
Nov 23 15:00:51 np0005532763 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 23 15:00:51 np0005532763 kernel: clk: Disabling unused clocks
Nov 23 15:00:51 np0005532763 kernel: Freeing unused decrypted memory: 2028K
Nov 23 15:00:51 np0005532763 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 23 15:00:51 np0005532763 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 23 15:00:51 np0005532763 kernel: usb 1-1: Product: QEMU USB Tablet
Nov 23 15:00:51 np0005532763 kernel: usb 1-1: Manufacturer: QEMU
Nov 23 15:00:51 np0005532763 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 23 15:00:51 np0005532763 kernel: Freeing unused kernel image (initmem) memory: 4188K
Nov 23 15:00:51 np0005532763 kernel: Write protecting the kernel read-only data: 30720k
Nov 23 15:00:51 np0005532763 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 23 15:00:51 np0005532763 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 23 15:00:51 np0005532763 kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 23 15:00:51 np0005532763 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 23 15:00:51 np0005532763 kernel: Run /init as init process
Nov 23 15:00:51 np0005532763 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 23 15:00:51 np0005532763 systemd: Detected virtualization kvm.
Nov 23 15:00:51 np0005532763 systemd: Detected architecture x86-64.
Nov 23 15:00:51 np0005532763 systemd: Running in initrd.
Nov 23 15:00:51 np0005532763 systemd: No hostname configured, using default hostname.
Nov 23 15:00:51 np0005532763 systemd: Hostname set to <localhost>.
Nov 23 15:00:51 np0005532763 systemd: Initializing machine ID from VM UUID.
Nov 23 15:00:51 np0005532763 systemd: Queued start job for default target Initrd Default Target.
Nov 23 15:00:51 np0005532763 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 23 15:00:51 np0005532763 systemd: Reached target Local Encrypted Volumes.
Nov 23 15:00:51 np0005532763 systemd: Reached target Initrd /usr File System.
Nov 23 15:00:51 np0005532763 systemd: Reached target Local File Systems.
Nov 23 15:00:51 np0005532763 systemd: Reached target Path Units.
Nov 23 15:00:51 np0005532763 systemd: Reached target Slice Units.
Nov 23 15:00:51 np0005532763 systemd: Reached target Swaps.
Nov 23 15:00:51 np0005532763 systemd: Reached target Timer Units.
Nov 23 15:00:51 np0005532763 systemd: Listening on D-Bus System Message Bus Socket.
Nov 23 15:00:51 np0005532763 systemd: Listening on Journal Socket (/dev/log).
Nov 23 15:00:51 np0005532763 systemd: Listening on Journal Socket.
Nov 23 15:00:51 np0005532763 systemd: Listening on udev Control Socket.
Nov 23 15:00:51 np0005532763 systemd: Listening on udev Kernel Socket.
Nov 23 15:00:51 np0005532763 systemd: Reached target Socket Units.
Nov 23 15:00:51 np0005532763 systemd: Starting Create List of Static Device Nodes...
Nov 23 15:00:51 np0005532763 systemd: Starting Journal Service...
Nov 23 15:00:51 np0005532763 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 23 15:00:51 np0005532763 systemd: Starting Apply Kernel Variables...
Nov 23 15:00:51 np0005532763 systemd: Starting Create System Users...
Nov 23 15:00:51 np0005532763 systemd: Starting Setup Virtual Console...
Nov 23 15:00:51 np0005532763 systemd: Finished Create List of Static Device Nodes.
Nov 23 15:00:51 np0005532763 systemd: Finished Apply Kernel Variables.
Nov 23 15:00:51 np0005532763 systemd: Finished Create System Users.
Nov 23 15:00:51 np0005532763 systemd-journald[307]: Journal started
Nov 23 15:00:51 np0005532763 systemd-journald[307]: Runtime Journal (/run/log/journal/e38e4d8bcfb84d2487523d68cd15bb48) is 8.0M, max 153.6M, 145.6M free.
Nov 23 15:00:51 np0005532763 systemd-sysusers[312]: Creating group 'users' with GID 100.
Nov 23 15:00:51 np0005532763 systemd-sysusers[312]: Creating group 'dbus' with GID 81.
Nov 23 15:00:51 np0005532763 systemd-sysusers[312]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 23 15:00:51 np0005532763 systemd: Started Journal Service.
Nov 23 15:00:51 np0005532763 systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 23 15:00:51 np0005532763 systemd[1]: Starting Create Volatile Files and Directories...
Nov 23 15:00:51 np0005532763 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 23 15:00:51 np0005532763 systemd[1]: Finished Setup Virtual Console.
Nov 23 15:00:51 np0005532763 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 23 15:00:51 np0005532763 systemd[1]: Starting dracut cmdline hook...
Nov 23 15:00:51 np0005532763 dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Nov 23 15:00:51 np0005532763 dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 23 15:00:51 np0005532763 systemd[1]: Finished Create Volatile Files and Directories.
Nov 23 15:00:51 np0005532763 systemd[1]: Finished dracut cmdline hook.
Nov 23 15:00:51 np0005532763 systemd[1]: Starting dracut pre-udev hook...
Nov 23 15:00:51 np0005532763 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 23 15:00:51 np0005532763 kernel: device-mapper: uevent: version 1.0.3
Nov 23 15:00:51 np0005532763 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 23 15:00:52 np0005532763 kernel: RPC: Registered named UNIX socket transport module.
Nov 23 15:00:52 np0005532763 kernel: RPC: Registered udp transport module.
Nov 23 15:00:52 np0005532763 kernel: RPC: Registered tcp transport module.
Nov 23 15:00:52 np0005532763 kernel: RPC: Registered tcp-with-tls transport module.
Nov 23 15:00:52 np0005532763 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 23 15:00:52 np0005532763 rpc.statd[444]: Version 2.5.4 starting
Nov 23 15:00:52 np0005532763 rpc.statd[444]: Initializing NSM state
Nov 23 15:00:52 np0005532763 rpc.idmapd[449]: Setting log level to 0
Nov 23 15:00:52 np0005532763 systemd[1]: Finished dracut pre-udev hook.
Nov 23 15:00:52 np0005532763 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 23 15:00:52 np0005532763 systemd-udevd[462]: Using default interface naming scheme 'rhel-9.0'.
Nov 23 15:00:52 np0005532763 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 23 15:00:52 np0005532763 systemd[1]: Starting dracut pre-trigger hook...
Nov 23 15:00:52 np0005532763 systemd[1]: Finished dracut pre-trigger hook.
Nov 23 15:00:52 np0005532763 systemd[1]: Starting Coldplug All udev Devices...
Nov 23 15:00:52 np0005532763 systemd[1]: Created slice Slice /system/modprobe.
Nov 23 15:00:52 np0005532763 systemd[1]: Starting Load Kernel Module configfs...
Nov 23 15:00:52 np0005532763 systemd[1]: Finished Coldplug All udev Devices.
Nov 23 15:00:52 np0005532763 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 23 15:00:52 np0005532763 systemd[1]: Finished Load Kernel Module configfs.
Nov 23 15:00:52 np0005532763 systemd[1]: Mounting Kernel Configuration File System...
Nov 23 15:00:52 np0005532763 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 23 15:00:52 np0005532763 systemd[1]: Reached target Network.
Nov 23 15:00:52 np0005532763 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 23 15:00:52 np0005532763 systemd[1]: Starting dracut initqueue hook...
Nov 23 15:00:52 np0005532763 systemd[1]: Mounted Kernel Configuration File System.
Nov 23 15:00:52 np0005532763 systemd[1]: Reached target System Initialization.
Nov 23 15:00:52 np0005532763 systemd[1]: Reached target Basic System.
Nov 23 15:00:52 np0005532763 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 23 15:00:52 np0005532763 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 23 15:00:52 np0005532763 kernel: vda: vda1
Nov 23 15:00:52 np0005532763 kernel: scsi host0: ata_piix
Nov 23 15:00:52 np0005532763 kernel: scsi host1: ata_piix
Nov 23 15:00:52 np0005532763 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 23 15:00:52 np0005532763 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 23 15:00:52 np0005532763 systemd[1]: Found device /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 23 15:00:52 np0005532763 systemd[1]: Reached target Initrd Root Device.
Nov 23 15:00:52 np0005532763 kernel: ata1: found unknown device (class 0)
Nov 23 15:00:52 np0005532763 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 23 15:00:52 np0005532763 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 23 15:00:52 np0005532763 systemd-udevd[489]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 15:00:52 np0005532763 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 23 15:00:52 np0005532763 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 23 15:00:52 np0005532763 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 23 15:00:53 np0005532763 systemd[1]: Finished dracut initqueue hook.
Nov 23 15:00:53 np0005532763 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 23 15:00:53 np0005532763 systemd[1]: Reached target Remote Encrypted Volumes.
Nov 23 15:00:53 np0005532763 systemd[1]: Reached target Remote File Systems.
Nov 23 15:00:53 np0005532763 systemd[1]: Starting dracut pre-mount hook...
Nov 23 15:00:53 np0005532763 systemd[1]: Finished dracut pre-mount hook.
Nov 23 15:00:53 np0005532763 systemd[1]: Starting File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709...
Nov 23 15:00:53 np0005532763 systemd-fsck[561]: /usr/sbin/fsck.xfs: XFS file system.
Nov 23 15:00:53 np0005532763 systemd[1]: Finished File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 23 15:00:53 np0005532763 systemd[1]: Mounting /sysroot...
Nov 23 15:00:53 np0005532763 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 23 15:00:53 np0005532763 kernel: XFS (vda1): Mounting V5 Filesystem 47e3724e-7a1b-439a-9543-b98c9a290709
Nov 23 15:00:54 np0005532763 kernel: XFS (vda1): Ending clean mount
Nov 23 15:00:54 np0005532763 systemd[1]: Mounted /sysroot.
Nov 23 15:00:54 np0005532763 systemd[1]: Reached target Initrd Root File System.
Nov 23 15:00:54 np0005532763 systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 23 15:00:54 np0005532763 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 23 15:00:54 np0005532763 systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 23 15:00:54 np0005532763 systemd[1]: Reached target Initrd File Systems.
Nov 23 15:00:54 np0005532763 systemd[1]: Reached target Initrd Default Target.
Nov 23 15:00:54 np0005532763 systemd[1]: Starting dracut mount hook...
Nov 23 15:00:54 np0005532763 systemd[1]: Finished dracut mount hook.
Nov 23 15:00:54 np0005532763 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 23 15:00:55 np0005532763 rpc.idmapd[449]: exiting on signal 15
Nov 23 15:00:55 np0005532763 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 23 15:00:55 np0005532763 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped target Network.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped target Timer Units.
Nov 23 15:00:55 np0005532763 systemd[1]: dbus.socket: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 23 15:00:55 np0005532763 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped target Initrd Default Target.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped target Basic System.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped target Initrd Root Device.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped target Initrd /usr File System.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped target Path Units.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped target Remote File Systems.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped target Slice Units.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped target Socket Units.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped target System Initialization.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped target Local File Systems.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped target Swaps.
Nov 23 15:00:55 np0005532763 systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped dracut mount hook.
Nov 23 15:00:55 np0005532763 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped dracut pre-mount hook.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped target Local Encrypted Volumes.
Nov 23 15:00:55 np0005532763 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 23 15:00:55 np0005532763 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped dracut initqueue hook.
Nov 23 15:00:55 np0005532763 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped Apply Kernel Variables.
Nov 23 15:00:55 np0005532763 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped Create Volatile Files and Directories.
Nov 23 15:00:55 np0005532763 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped Coldplug All udev Devices.
Nov 23 15:00:55 np0005532763 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped dracut pre-trigger hook.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 23 15:00:55 np0005532763 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped Setup Virtual Console.
Nov 23 15:00:55 np0005532763 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 23 15:00:55 np0005532763 systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 23 15:00:55 np0005532763 systemd[1]: systemd-udevd.service: Consumed 1.019s CPU time.
Nov 23 15:00:55 np0005532763 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Closed udev Control Socket.
Nov 23 15:00:55 np0005532763 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Closed udev Kernel Socket.
Nov 23 15:00:55 np0005532763 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped dracut pre-udev hook.
Nov 23 15:00:55 np0005532763 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped dracut cmdline hook.
Nov 23 15:00:55 np0005532763 systemd[1]: Starting Cleanup udev Database...
Nov 23 15:00:55 np0005532763 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 23 15:00:55 np0005532763 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped Create List of Static Device Nodes.
Nov 23 15:00:55 np0005532763 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Stopped Create System Users.
Nov 23 15:00:55 np0005532763 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 23 15:00:55 np0005532763 systemd[1]: Finished Cleanup udev Database.
Nov 23 15:00:55 np0005532763 systemd[1]: Reached target Switch Root.
Nov 23 15:00:55 np0005532763 systemd[1]: Starting Switch Root...
Nov 23 15:00:55 np0005532763 systemd[1]: Switching root.
Nov 23 15:00:55 np0005532763 systemd-journald[307]: Journal stopped
Nov 23 15:00:57 np0005532763 systemd-journald: Received SIGTERM from PID 1 (systemd).
Nov 23 15:00:57 np0005532763 kernel: audit: type=1404 audit(1763928055.821:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 23 15:00:57 np0005532763 kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 15:00:57 np0005532763 kernel: SELinux:  policy capability open_perms=1
Nov 23 15:00:57 np0005532763 kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 15:00:57 np0005532763 kernel: SELinux:  policy capability always_check_network=0
Nov 23 15:00:57 np0005532763 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 15:00:57 np0005532763 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 15:00:57 np0005532763 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 15:00:57 np0005532763 kernel: audit: type=1403 audit(1763928056.086:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 23 15:00:57 np0005532763 systemd: Successfully loaded SELinux policy in 282.834ms.
Nov 23 15:00:57 np0005532763 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 55.416ms.
Nov 23 15:00:57 np0005532763 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 23 15:00:57 np0005532763 systemd: Detected virtualization kvm.
Nov 23 15:00:57 np0005532763 systemd: Detected architecture x86-64.
Nov 23 15:00:57 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:00:57 np0005532763 systemd: initrd-switch-root.service: Deactivated successfully.
Nov 23 15:00:57 np0005532763 systemd: Stopped Switch Root.
Nov 23 15:00:57 np0005532763 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 23 15:00:57 np0005532763 systemd: Created slice Slice /system/getty.
Nov 23 15:00:57 np0005532763 systemd: Created slice Slice /system/serial-getty.
Nov 23 15:00:57 np0005532763 systemd: Created slice Slice /system/sshd-keygen.
Nov 23 15:00:57 np0005532763 systemd: Created slice User and Session Slice.
Nov 23 15:00:57 np0005532763 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 23 15:00:57 np0005532763 systemd: Started Forward Password Requests to Wall Directory Watch.
Nov 23 15:00:57 np0005532763 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 23 15:00:57 np0005532763 systemd: Reached target Local Encrypted Volumes.
Nov 23 15:00:57 np0005532763 systemd: Stopped target Switch Root.
Nov 23 15:00:57 np0005532763 systemd: Stopped target Initrd File Systems.
Nov 23 15:00:57 np0005532763 systemd: Stopped target Initrd Root File System.
Nov 23 15:00:57 np0005532763 systemd: Reached target Local Integrity Protected Volumes.
Nov 23 15:00:57 np0005532763 systemd: Reached target Path Units.
Nov 23 15:00:57 np0005532763 systemd: Reached target rpc_pipefs.target.
Nov 23 15:00:57 np0005532763 systemd: Reached target Slice Units.
Nov 23 15:00:57 np0005532763 systemd: Reached target Swaps.
Nov 23 15:00:57 np0005532763 systemd: Reached target Local Verity Protected Volumes.
Nov 23 15:00:57 np0005532763 systemd: Listening on RPCbind Server Activation Socket.
Nov 23 15:00:57 np0005532763 systemd: Reached target RPC Port Mapper.
Nov 23 15:00:57 np0005532763 systemd: Listening on Process Core Dump Socket.
Nov 23 15:00:57 np0005532763 systemd: Listening on initctl Compatibility Named Pipe.
Nov 23 15:00:57 np0005532763 systemd: Listening on udev Control Socket.
Nov 23 15:00:57 np0005532763 systemd: Listening on udev Kernel Socket.
Nov 23 15:00:57 np0005532763 systemd: Mounting Huge Pages File System...
Nov 23 15:00:57 np0005532763 systemd: Mounting POSIX Message Queue File System...
Nov 23 15:00:57 np0005532763 systemd: Mounting Kernel Debug File System...
Nov 23 15:00:57 np0005532763 systemd: Mounting Kernel Trace File System...
Nov 23 15:00:57 np0005532763 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 23 15:00:57 np0005532763 systemd: Starting Create List of Static Device Nodes...
Nov 23 15:00:57 np0005532763 systemd: Starting Load Kernel Module configfs...
Nov 23 15:00:57 np0005532763 systemd: Starting Load Kernel Module drm...
Nov 23 15:00:57 np0005532763 systemd: Starting Load Kernel Module efi_pstore...
Nov 23 15:00:57 np0005532763 systemd: Starting Load Kernel Module fuse...
Nov 23 15:00:57 np0005532763 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 23 15:00:57 np0005532763 systemd: systemd-fsck-root.service: Deactivated successfully.
Nov 23 15:00:57 np0005532763 systemd: Stopped File System Check on Root Device.
Nov 23 15:00:57 np0005532763 systemd: Stopped Journal Service.
Nov 23 15:00:57 np0005532763 systemd: Starting Journal Service...
Nov 23 15:00:57 np0005532763 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 23 15:00:57 np0005532763 systemd: Starting Generate network units from Kernel command line...
Nov 23 15:00:57 np0005532763 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 23 15:00:57 np0005532763 systemd: Starting Remount Root and Kernel File Systems...
Nov 23 15:00:57 np0005532763 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 23 15:00:57 np0005532763 systemd: Starting Apply Kernel Variables...
Nov 23 15:00:57 np0005532763 kernel: fuse: init (API version 7.37)
Nov 23 15:00:57 np0005532763 systemd: Starting Coldplug All udev Devices...
Nov 23 15:00:57 np0005532763 systemd: Mounted Huge Pages File System.
Nov 23 15:00:57 np0005532763 systemd: Mounted POSIX Message Queue File System.
Nov 23 15:00:57 np0005532763 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 23 15:00:57 np0005532763 systemd: Mounted Kernel Debug File System.
Nov 23 15:00:57 np0005532763 systemd: Mounted Kernel Trace File System.
Nov 23 15:00:57 np0005532763 systemd: Finished Create List of Static Device Nodes.
Nov 23 15:00:57 np0005532763 systemd-journald[685]: Journal started
Nov 23 15:00:57 np0005532763 systemd-journald[685]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 23 15:00:57 np0005532763 systemd[1]: Queued start job for default target Multi-User System.
Nov 23 15:00:57 np0005532763 systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 23 15:00:57 np0005532763 systemd: Started Journal Service.
Nov 23 15:00:57 np0005532763 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 23 15:00:57 np0005532763 systemd[1]: Finished Load Kernel Module configfs.
Nov 23 15:00:57 np0005532763 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 23 15:00:57 np0005532763 systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 23 15:00:57 np0005532763 kernel: ACPI: bus type drm_connector registered
Nov 23 15:00:57 np0005532763 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 23 15:00:57 np0005532763 systemd[1]: Finished Load Kernel Module fuse.
Nov 23 15:00:57 np0005532763 systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 23 15:00:57 np0005532763 systemd[1]: Finished Load Kernel Module drm.
Nov 23 15:00:57 np0005532763 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 23 15:00:57 np0005532763 systemd[1]: Finished Generate network units from Kernel command line.
Nov 23 15:00:57 np0005532763 systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 23 15:00:57 np0005532763 systemd[1]: Finished Apply Kernel Variables.
Nov 23 15:00:57 np0005532763 systemd[1]: Mounting FUSE Control File System...
Nov 23 15:00:57 np0005532763 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 23 15:00:57 np0005532763 systemd[1]: Starting Rebuild Hardware Database...
Nov 23 15:00:57 np0005532763 systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 23 15:00:57 np0005532763 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 23 15:00:57 np0005532763 systemd[1]: Starting Load/Save OS Random Seed...
Nov 23 15:00:57 np0005532763 systemd[1]: Starting Create System Users...
Nov 23 15:00:57 np0005532763 systemd[1]: Mounted FUSE Control File System.
Nov 23 15:00:57 np0005532763 systemd-journald[685]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 23 15:00:57 np0005532763 systemd-journald[685]: Received client request to flush runtime journal.
Nov 23 15:00:57 np0005532763 systemd[1]: Finished Coldplug All udev Devices.
Nov 23 15:00:57 np0005532763 systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 23 15:00:57 np0005532763 systemd[1]: Finished Load/Save OS Random Seed.
Nov 23 15:00:57 np0005532763 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 23 15:00:57 np0005532763 systemd[1]: Finished Create System Users.
Nov 23 15:00:57 np0005532763 systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 23 15:00:58 np0005532763 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 23 15:00:58 np0005532763 systemd[1]: Reached target Preparation for Local File Systems.
Nov 23 15:00:58 np0005532763 systemd[1]: Reached target Local File Systems.
Nov 23 15:00:58 np0005532763 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 23 15:00:58 np0005532763 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 23 15:00:58 np0005532763 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 23 15:00:58 np0005532763 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 23 15:00:58 np0005532763 systemd[1]: Starting Automatic Boot Loader Update...
Nov 23 15:00:58 np0005532763 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 23 15:00:58 np0005532763 systemd[1]: Starting Create Volatile Files and Directories...
Nov 23 15:00:58 np0005532763 bootctl[705]: Couldn't find EFI system partition, skipping.
Nov 23 15:00:58 np0005532763 systemd[1]: Finished Automatic Boot Loader Update.
Nov 23 15:00:58 np0005532763 systemd[1]: Finished Create Volatile Files and Directories.
Nov 23 15:00:58 np0005532763 systemd[1]: Starting Security Auditing Service...
Nov 23 15:00:58 np0005532763 systemd[1]: Starting RPC Bind...
Nov 23 15:00:58 np0005532763 systemd[1]: Starting Rebuild Journal Catalog...
Nov 23 15:00:58 np0005532763 auditd[711]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 23 15:00:58 np0005532763 auditd[711]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 23 15:00:58 np0005532763 systemd[1]: Finished Rebuild Journal Catalog.
Nov 23 15:00:58 np0005532763 systemd[1]: Started RPC Bind.
Nov 23 15:00:58 np0005532763 augenrules[716]: /sbin/augenrules: No change
Nov 23 15:00:58 np0005532763 augenrules[731]: No rules
Nov 23 15:00:58 np0005532763 augenrules[731]: enabled 1
Nov 23 15:00:58 np0005532763 augenrules[731]: failure 1
Nov 23 15:00:58 np0005532763 augenrules[731]: pid 711
Nov 23 15:00:58 np0005532763 augenrules[731]: rate_limit 0
Nov 23 15:00:58 np0005532763 augenrules[731]: backlog_limit 8192
Nov 23 15:00:58 np0005532763 augenrules[731]: lost 0
Nov 23 15:00:58 np0005532763 augenrules[731]: backlog 0
Nov 23 15:00:58 np0005532763 augenrules[731]: backlog_wait_time 60000
Nov 23 15:00:58 np0005532763 augenrules[731]: backlog_wait_time_actual 0
Nov 23 15:00:58 np0005532763 augenrules[731]: enabled 1
Nov 23 15:00:58 np0005532763 augenrules[731]: failure 1
Nov 23 15:00:58 np0005532763 augenrules[731]: pid 711
Nov 23 15:00:58 np0005532763 augenrules[731]: rate_limit 0
Nov 23 15:00:58 np0005532763 augenrules[731]: backlog_limit 8192
Nov 23 15:00:58 np0005532763 augenrules[731]: lost 0
Nov 23 15:00:58 np0005532763 augenrules[731]: backlog 2
Nov 23 15:00:58 np0005532763 augenrules[731]: backlog_wait_time 60000
Nov 23 15:00:58 np0005532763 augenrules[731]: backlog_wait_time_actual 0
Nov 23 15:00:58 np0005532763 augenrules[731]: enabled 1
Nov 23 15:00:58 np0005532763 augenrules[731]: failure 1
Nov 23 15:00:58 np0005532763 augenrules[731]: pid 711
Nov 23 15:00:58 np0005532763 augenrules[731]: rate_limit 0
Nov 23 15:00:58 np0005532763 augenrules[731]: backlog_limit 8192
Nov 23 15:00:58 np0005532763 augenrules[731]: lost 0
Nov 23 15:00:58 np0005532763 augenrules[731]: backlog 4
Nov 23 15:00:58 np0005532763 augenrules[731]: backlog_wait_time 60000
Nov 23 15:00:58 np0005532763 augenrules[731]: backlog_wait_time_actual 0
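The three augenrules blocks above are kernel audit status dumps, one per rules load, printed as bare "key value" pairs (note `backlog` climbing 0 → 2 → 4 while rules are applied). A minimal sketch (editor's annotation, not part of the log) for turning one such dump into a dict; the field names are taken verbatim from the lines above:

```python
def parse_audit_status(lines):
    """Parse "key value" pairs as printed by augenrules / the kernel
    audit status dump (e.g. "backlog_limit 8192") into an int-valued dict."""
    status = {}
    for line in lines:
        key, _, value = line.strip().partition(" ")
        # Keep only well-formed numeric fields; skip anything else.
        if key and value.lstrip("-").isdigit():
            status[key] = int(value)
    return status

sample = ["enabled 1", "failure 1", "pid 711", "backlog_limit 8192", "backlog 2"]
print(parse_audit_status(sample))
# → {'enabled': 1, 'failure': 1, 'pid': 711, 'backlog_limit': 8192, 'backlog': 2}
```

The same fields can be read live with `auditctl -s`, which is what augenrules echoes after each load.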
Nov 23 15:00:58 np0005532763 systemd[1]: Started Security Auditing Service.
Nov 23 15:00:58 np0005532763 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 23 15:00:58 np0005532763 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 23 15:00:58 np0005532763 systemd[1]: Finished Rebuild Hardware Database.
Nov 23 15:00:58 np0005532763 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 23 15:00:58 np0005532763 systemd-udevd[739]: Using default interface naming scheme 'rhel-9.0'.
Nov 23 15:00:58 np0005532763 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 23 15:00:58 np0005532763 systemd[1]: Starting Load Kernel Module configfs...
Nov 23 15:00:58 np0005532763 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 23 15:00:58 np0005532763 systemd[1]: Finished Load Kernel Module configfs.
Nov 23 15:00:58 np0005532763 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 23 15:00:58 np0005532763 systemd-udevd[743]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 15:00:58 np0005532763 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 23 15:00:58 np0005532763 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 23 15:00:58 np0005532763 systemd[1]: Starting Update is Completed...
Nov 23 15:00:58 np0005532763 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 23 15:00:58 np0005532763 systemd[1]: Finished Update is Completed.
Nov 23 15:00:58 np0005532763 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 23 15:00:58 np0005532763 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 23 15:00:58 np0005532763 systemd[1]: Reached target System Initialization.
Nov 23 15:00:58 np0005532763 systemd[1]: Started dnf makecache --timer.
Nov 23 15:00:58 np0005532763 systemd[1]: Started Daily rotation of log files.
Nov 23 15:00:58 np0005532763 systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 23 15:00:58 np0005532763 systemd[1]: Reached target Timer Units.
Nov 23 15:00:58 np0005532763 systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 23 15:00:58 np0005532763 kernel: kvm_amd: TSC scaling supported
Nov 23 15:00:58 np0005532763 kernel: kvm_amd: Nested Virtualization enabled
Nov 23 15:00:58 np0005532763 kernel: kvm_amd: Nested Paging enabled
Nov 23 15:00:58 np0005532763 kernel: kvm_amd: LBR virtualization supported
Nov 23 15:00:58 np0005532763 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 23 15:00:58 np0005532763 systemd[1]: Reached target Socket Units.
Nov 23 15:00:58 np0005532763 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 23 15:00:58 np0005532763 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 23 15:00:58 np0005532763 kernel: Console: switching to colour dummy device 80x25
Nov 23 15:00:58 np0005532763 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 23 15:00:58 np0005532763 kernel: [drm] features: -context_init
Nov 23 15:00:58 np0005532763 kernel: [drm] number of scanouts: 1
Nov 23 15:00:58 np0005532763 kernel: [drm] number of cap sets: 0
Nov 23 15:00:58 np0005532763 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 23 15:00:58 np0005532763 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 23 15:00:58 np0005532763 kernel: Console: switching to colour frame buffer device 128x48
Nov 23 15:00:58 np0005532763 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 23 15:00:58 np0005532763 systemd[1]: Starting D-Bus System Message Bus...
Nov 23 15:00:58 np0005532763 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 23 15:00:58 np0005532763 systemd[1]: Started D-Bus System Message Bus.
Nov 23 15:00:58 np0005532763 dbus-broker-lau[794]: Ready
Nov 23 15:00:58 np0005532763 systemd[1]: Reached target Basic System.
Nov 23 15:00:59 np0005532763 systemd[1]: Starting NTP client/server...
Nov 23 15:00:59 np0005532763 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 23 15:00:59 np0005532763 systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 23 15:00:59 np0005532763 systemd[1]: Starting IPv4 firewall with iptables...
Nov 23 15:00:59 np0005532763 systemd[1]: Started irqbalance daemon.
Nov 23 15:00:59 np0005532763 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 23 15:00:59 np0005532763 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 15:00:59 np0005532763 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 15:00:59 np0005532763 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 15:00:59 np0005532763 systemd[1]: Reached target sshd-keygen.target.
Nov 23 15:00:59 np0005532763 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 23 15:00:59 np0005532763 systemd[1]: Reached target User and Group Name Lookups.
Nov 23 15:00:59 np0005532763 systemd[1]: Starting User Login Management...
Nov 23 15:00:59 np0005532763 systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 23 15:00:59 np0005532763 chronyd[837]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 23 15:00:59 np0005532763 chronyd[837]: Loaded 0 symmetric keys
Nov 23 15:00:59 np0005532763 chronyd[837]: Using right/UTC timezone to obtain leap second data
Nov 23 15:00:59 np0005532763 chronyd[837]: Loaded seccomp filter (level 2)
Nov 23 15:00:59 np0005532763 systemd[1]: Started NTP client/server.
Nov 23 15:00:59 np0005532763 systemd-logind[830]: New seat seat0.
Nov 23 15:00:59 np0005532763 systemd-logind[830]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 23 15:00:59 np0005532763 systemd-logind[830]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 23 15:00:59 np0005532763 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 23 15:00:59 np0005532763 systemd[1]: Started User Login Management.
Nov 23 15:00:59 np0005532763 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 23 15:00:59 np0005532763 iptables.init[824]: iptables: Applying firewall rules: [  OK  ]
Nov 23 15:00:59 np0005532763 systemd[1]: Finished IPv4 firewall with iptables.
Nov 23 15:00:59 np0005532763 cloud-init[847]: Cloud-init v. 24.4-7.el9 running 'init-local' at Sun, 23 Nov 2025 20:00:59 +0000. Up 10.54 seconds.
Nov 23 15:00:59 np0005532763 systemd[1]: run-cloud\x2dinit-tmp-tmp9oworr7s.mount: Deactivated successfully.
Nov 23 15:00:59 np0005532763 systemd[1]: Starting Hostname Service...
Nov 23 15:01:00 np0005532763 systemd[1]: Started Hostname Service.
Nov 23 15:01:00 np0005532763 systemd-hostnamed[861]: Hostname set to <np0005532763.novalocal> (static)
Nov 23 15:01:00 np0005532763 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 23 15:01:00 np0005532763 systemd[1]: Reached target Preparation for Network.
Nov 23 15:01:00 np0005532763 systemd[1]: Starting Network Manager...
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2244] NetworkManager (version 1.54.1-1.el9) is starting... (boot:a896f7ce-22ce-43ea-a9ba-7e128facc2bb)
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2251] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2484] manager[0x55a7698a9080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2555] hostname: hostname: using hostnamed
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2555] hostname: static hostname changed from (none) to "np0005532763.novalocal"
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2558] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2686] manager[0x55a7698a9080]: rfkill: Wi-Fi hardware radio set enabled
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2687] manager[0x55a7698a9080]: rfkill: WWAN hardware radio set enabled
Nov 23 15:01:00 np0005532763 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2812] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2813] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2814] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2815] manager: Networking is enabled by state file
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2822] settings: Loaded settings plugin: keyfile (internal)
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2852] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2884] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2909] dhcp: init: Using DHCP client 'internal'
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2912] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2928] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2942] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2951] device (lo): Activation: starting connection 'lo' (ac07c606-0f80-4608-b8dd-99a45b6a547d)
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2961] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2964] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.2998] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3002] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3004] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3006] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3007] device (eth0): carrier: link connected
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3010] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3016] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3023] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3027] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3028] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3031] manager: NetworkManager state is now CONNECTING
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3032] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 23 15:01:00 np0005532763 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3038] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3040] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 23 15:01:00 np0005532763 systemd[1]: Started Network Manager.
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3080] dhcp4 (eth0): state changed new lease, address=38.102.83.111
Nov 23 15:01:00 np0005532763 systemd[1]: Reached target Network.
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3091] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3120] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 23 15:01:00 np0005532763 systemd[1]: Starting Network Manager Wait Online...
Nov 23 15:01:00 np0005532763 systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 23 15:01:00 np0005532763 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3286] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3289] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3293] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3299] device (lo): Activation: successful, device activated.
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3305] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3309] manager: NetworkManager state is now CONNECTED_SITE
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3313] device (eth0): Activation: successful, device activated.
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3319] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 23 15:01:00 np0005532763 NetworkManager[865]: <info>  [1763928060.3322] manager: startup complete
Nov 23 15:01:00 np0005532763 systemd[1]: Finished Network Manager Wait Online.
Nov 23 15:01:00 np0005532763 systemd[1]: Starting Cloud-init: Network Stage...
Nov 23 15:01:00 np0005532763 systemd[1]: Started GSSAPI Proxy Daemon.
Nov 23 15:01:00 np0005532763 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 23 15:01:00 np0005532763 systemd[1]: Reached target NFS client services.
Nov 23 15:01:00 np0005532763 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 23 15:01:00 np0005532763 systemd[1]: Reached target Remote File Systems.
Nov 23 15:01:00 np0005532763 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 23 15:01:00 np0005532763 cloud-init[928]: Cloud-init v. 24.4-7.el9 running 'init' at Sun, 23 Nov 2025 20:01:00 +0000. Up 11.54 seconds.
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: |  eth0  | True |        38.102.83.111         | 255.255.255.0 | global | fa:16:3e:5b:f4:b3 |
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: |  eth0  | True | fe80::f816:3eff:fe5b:f4b3/64 |       .       |  link  | fa:16:3e:5b:f4:b3 |
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: +-------+-------------+---------+-----------+-------+
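The ci-info blocks above are cloud-init's ASCII tables for net devices and routes. A small sketch (editor's annotation, assuming the `ci-info:` prefix shown in the log) that splits one data row back into its cell values; separator rows (`+---+`) are left for the caller to filter:

```python
def parse_ci_table_row(line):
    # Take everything after the "ci-info: " marker, drop the outer pipes,
    # and strip each cell, e.g. "| eth0 | True | ... |" -> ["eth0", "True", ...].
    body = line.split("ci-info: ", 1)[1]
    return [cell.strip() for cell in body.strip().strip("|").split("|")]

row = ("Nov 23 15:01:00 np0005532763 cloud-init[928]: ci-info: "
       "|  eth0  | True |        38.102.83.111         | 255.255.255.0 | global | fa:16:3e:5b:f4:b3 |")
print(parse_ci_table_row(row))
# → ['eth0', 'True', '38.102.83.111', '255.255.255.0', 'global', 'fa:16:3e:5b:f4:b3']
```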
Nov 23 15:01:02 np0005532763 cloud-init[928]: Generating public/private rsa key pair.
Nov 23 15:01:02 np0005532763 cloud-init[928]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 23 15:01:02 np0005532763 cloud-init[928]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 23 15:01:02 np0005532763 cloud-init[928]: The key fingerprint is:
Nov 23 15:01:02 np0005532763 cloud-init[928]: SHA256:Pwt7hxrn5F3tpy+ZyjwZ2DxFUKpznXCg0/Lnff2N/S0 root@np0005532763.novalocal
Nov 23 15:01:02 np0005532763 cloud-init[928]: The key's randomart image is:
Nov 23 15:01:02 np0005532763 cloud-init[928]: +---[RSA 3072]----+
Nov 23 15:01:02 np0005532763 cloud-init[928]: |            oo.  |
Nov 23 15:01:02 np0005532763 cloud-init[928]: |           o o.  |
Nov 23 15:01:02 np0005532763 cloud-init[928]: |          + +..  |
Nov 23 15:01:02 np0005532763 cloud-init[928]: |           = +.. |
Nov 23 15:01:02 np0005532763 cloud-init[928]: |        S o+o.+  |
Nov 23 15:01:02 np0005532763 cloud-init[928]: |         ..o=o o.|
Nov 23 15:01:02 np0005532763 cloud-init[928]: |        o =. +oo=|
Nov 23 15:01:02 np0005532763 cloud-init[928]: |         Oo*+.E+*|
Nov 23 15:01:02 np0005532763 cloud-init[928]: |        oo+.=oo=@|
Nov 23 15:01:02 np0005532763 cloud-init[928]: +----[SHA256]-----+
Nov 23 15:01:02 np0005532763 cloud-init[928]: Generating public/private ecdsa key pair.
Nov 23 15:01:02 np0005532763 cloud-init[928]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 23 15:01:02 np0005532763 cloud-init[928]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 23 15:01:02 np0005532763 cloud-init[928]: The key fingerprint is:
Nov 23 15:01:02 np0005532763 cloud-init[928]: SHA256:5tu5WvGLKz8qRX14xn/v/gLNT/9LSHq/ryepbmqkt9g root@np0005532763.novalocal
Nov 23 15:01:02 np0005532763 cloud-init[928]: The key's randomart image is:
Nov 23 15:01:02 np0005532763 cloud-init[928]: +---[ECDSA 256]---+
Nov 23 15:01:02 np0005532763 cloud-init[928]: |                 |
Nov 23 15:01:02 np0005532763 cloud-init[928]: |                 |
Nov 23 15:01:02 np0005532763 cloud-init[928]: |         . o     |
Nov 23 15:01:02 np0005532763 cloud-init[928]: |        . o =    |
Nov 23 15:01:02 np0005532763 cloud-init[928]: |       .S .+ =   |
Nov 23 15:01:02 np0005532763 cloud-init[928]: |       o. .o+ = o|
Nov 23 15:01:02 np0005532763 cloud-init[928]: |       ..o...+ *o|
Nov 23 15:01:02 np0005532763 cloud-init[928]: |      . o*+oo.* =|
Nov 23 15:01:02 np0005532763 cloud-init[928]: |       .=BEO+..@%|
Nov 23 15:01:02 np0005532763 cloud-init[928]: +----[SHA256]-----+
Nov 23 15:01:02 np0005532763 cloud-init[928]: Generating public/private ed25519 key pair.
Nov 23 15:01:02 np0005532763 cloud-init[928]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 23 15:01:02 np0005532763 cloud-init[928]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 23 15:01:02 np0005532763 cloud-init[928]: The key fingerprint is:
Nov 23 15:01:02 np0005532763 cloud-init[928]: SHA256:4gDlKP2H6bcAJRMuf7++Vjy/sR2Cr0fPJmnnVc0YaZM root@np0005532763.novalocal
Nov 23 15:01:02 np0005532763 cloud-init[928]: The key's randomart image is:
Nov 23 15:01:02 np0005532763 cloud-init[928]: +--[ED25519 256]--+
Nov 23 15:01:02 np0005532763 cloud-init[928]: |  . .            |
Nov 23 15:01:02 np0005532763 cloud-init[928]: | o =           o |
Nov 23 15:01:02 np0005532763 cloud-init[928]: |o B o         E  |
Nov 23 15:01:02 np0005532763 cloud-init[928]: | + * o       . =.|
Nov 23 15:01:02 np0005532763 cloud-init[928]: |  o * o.S     . +|
Nov 23 15:01:02 np0005532763 cloud-init[928]: |   + = .+..    . |
Nov 23 15:01:02 np0005532763 cloud-init[928]: |    o +..+o+. .  |
Nov 23 15:01:02 np0005532763 cloud-init[928]: |     o.o .*==o   |
Nov 23 15:01:02 np0005532763 cloud-init[928]: |     o=..++*o    |
Nov 23 15:01:02 np0005532763 cloud-init[928]: +----[SHA256]-----+
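The `SHA256:...` strings printed during host key generation above are OpenSSH fingerprints: the SHA-256 digest of the raw public key blob, base64-encoded with padding stripped. A sketch of that format (editor's annotation; the input here is an arbitrary base64 blob, not one of the real host keys from the log):

```python
import base64
import hashlib

def ssh_fingerprint(pubkey_blob_b64):
    # OpenSSH-style fingerprint: "SHA256:" + base64(sha256(blob)), no '=' padding,
    # matching the format ssh-keygen -lf prints and the log shows above.
    digest = hashlib.sha256(base64.b64decode(pubkey_blob_b64)).digest()
    return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

print(ssh_fingerprint("dGVzdA=="))  # fingerprint of the bytes b"test"
```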
Nov 23 15:01:02 np0005532763 sm-notify[1010]: Version 2.5.4 starting
Nov 23 15:01:02 np0005532763 systemd[1]: Finished Cloud-init: Network Stage.
Nov 23 15:01:02 np0005532763 systemd[1]: Reached target Cloud-config availability.
Nov 23 15:01:02 np0005532763 systemd[1]: Reached target Network is Online.
Nov 23 15:01:02 np0005532763 systemd[1]: Starting Cloud-init: Config Stage...
Nov 23 15:01:02 np0005532763 systemd[1]: Starting Crash recovery kernel arming...
Nov 23 15:01:02 np0005532763 systemd[1]: Starting Notify NFS peers of a restart...
Nov 23 15:01:02 np0005532763 systemd[1]: Starting System Logging Service...
Nov 23 15:01:02 np0005532763 systemd[1]: Starting OpenSSH server daemon...
Nov 23 15:01:02 np0005532763 systemd[1]: Starting Permit User Sessions...
Nov 23 15:01:02 np0005532763 systemd[1]: Started Notify NFS peers of a restart.
Nov 23 15:01:02 np0005532763 systemd[1]: Started OpenSSH server daemon.
Nov 23 15:01:02 np0005532763 systemd[1]: Finished Permit User Sessions.
Nov 23 15:01:02 np0005532763 systemd[1]: Started Command Scheduler.
Nov 23 15:01:02 np0005532763 systemd[1]: Started Getty on tty1.
Nov 23 15:01:02 np0005532763 systemd[1]: Started Serial Getty on ttyS0.
Nov 23 15:01:02 np0005532763 systemd[1]: Reached target Login Prompts.
Nov 23 15:01:02 np0005532763 rsyslogd[1011]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1011" x-info="https://www.rsyslog.com"] start
Nov 23 15:01:02 np0005532763 rsyslogd[1011]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 23 15:01:02 np0005532763 systemd[1]: Started System Logging Service.
Nov 23 15:01:02 np0005532763 systemd[1]: Reached target Multi-User System.
Nov 23 15:01:02 np0005532763 systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 23 15:01:02 np0005532763 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 23 15:01:02 np0005532763 systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 23 15:01:02 np0005532763 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 15:01:02 np0005532763 kdumpctl[1025]: kdump: No kdump initial ramdisk found.
Nov 23 15:01:02 np0005532763 kdumpctl[1025]: kdump: Rebuilding /boot/initramfs-5.14.0-639.el9.x86_64kdump.img
Nov 23 15:01:02 np0005532763 cloud-init[1117]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Sun, 23 Nov 2025 20:01:02 +0000. Up 13.44 seconds.
Nov 23 15:01:02 np0005532763 systemd[1]: Finished Cloud-init: Config Stage.
Nov 23 15:01:02 np0005532763 systemd[1]: Starting Cloud-init: Final Stage...
Nov 23 15:01:03 np0005532763 cloud-init[1276]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Sun, 23 Nov 2025 20:01:02 +0000. Up 13.87 seconds.
Nov 23 15:01:03 np0005532763 cloud-init[1292]: #############################################################
Nov 23 15:01:03 np0005532763 cloud-init[1295]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 23 15:01:03 np0005532763 dracut[1294]: dracut-057-102.git20250818.el9
Nov 23 15:01:03 np0005532763 cloud-init[1298]: 256 SHA256:5tu5WvGLKz8qRX14xn/v/gLNT/9LSHq/ryepbmqkt9g root@np0005532763.novalocal (ECDSA)
Nov 23 15:01:03 np0005532763 cloud-init[1303]: 256 SHA256:4gDlKP2H6bcAJRMuf7++Vjy/sR2Cr0fPJmnnVc0YaZM root@np0005532763.novalocal (ED25519)
Nov 23 15:01:03 np0005532763 cloud-init[1316]: 3072 SHA256:Pwt7hxrn5F3tpy+ZyjwZ2DxFUKpznXCg0/Lnff2N/S0 root@np0005532763.novalocal (RSA)
Nov 23 15:01:03 np0005532763 cloud-init[1317]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 23 15:01:03 np0005532763 cloud-init[1318]: #############################################################
Nov 23 15:01:03 np0005532763 cloud-init[1276]: Cloud-init v. 24.4-7.el9 finished at Sun, 23 Nov 2025 20:01:03 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 14.08 seconds
Nov 23 15:01:03 np0005532763 systemd[1]: Finished Cloud-init: Final Stage.
Nov 23 15:01:03 np0005532763 systemd[1]: Reached target Cloud-init target.
Nov 23 15:01:03 np0005532763 dracut[1299]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-639.el9.x86_64kdump.img 5.14.0-639.el9.x86_64
Nov 23 15:01:03 np0005532763 dracut[1299]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 23 15:01:03 np0005532763 dracut[1299]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 23 15:01:03 np0005532763 dracut[1299]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 23 15:01:03 np0005532763 dracut[1299]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 23 15:01:03 np0005532763 dracut[1299]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: memstrack is not available
Nov 23 15:01:04 np0005532763 dracut[1299]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 23 15:01:04 np0005532763 dracut[1299]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 23 15:01:05 np0005532763 dracut[1299]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 23 15:01:05 np0005532763 dracut[1299]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 23 15:01:05 np0005532763 dracut[1299]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 23 15:01:05 np0005532763 dracut[1299]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 23 15:01:05 np0005532763 dracut[1299]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 23 15:01:05 np0005532763 dracut[1299]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 23 15:01:05 np0005532763 dracut[1299]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 23 15:01:05 np0005532763 dracut[1299]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 23 15:01:05 np0005532763 dracut[1299]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 23 15:01:05 np0005532763 dracut[1299]: memstrack is not available
Nov 23 15:01:05 np0005532763 dracut[1299]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 23 15:01:05 np0005532763 chronyd[837]: Selected source 138.197.164.54 (2.centos.pool.ntp.org)
Nov 23 15:01:06 np0005532763 chronyd[837]: System clock wrong by 1.324959 seconds
Nov 23 15:01:06 np0005532763 chronyd[837]: System clock was stepped by 1.324959 seconds
Nov 23 15:01:06 np0005532763 chronyd[837]: System clock TAI offset set to 37 seconds
Nov 23 15:01:07 np0005532763 dracut[1299]: *** Including module: systemd ***
Nov 23 15:01:08 np0005532763 dracut[1299]: *** Including module: fips ***
Nov 23 15:01:08 np0005532763 chronyd[837]: Selected source 149.56.19.163 (2.centos.pool.ntp.org)
Nov 23 15:01:08 np0005532763 dracut[1299]: *** Including module: systemd-initrd ***
Nov 23 15:01:08 np0005532763 dracut[1299]: *** Including module: i18n ***
Nov 23 15:01:08 np0005532763 dracut[1299]: *** Including module: drm ***
Nov 23 15:01:09 np0005532763 dracut[1299]: *** Including module: prefixdevname ***
Nov 23 15:01:09 np0005532763 dracut[1299]: *** Including module: kernel-modules ***
Nov 23 15:01:09 np0005532763 kernel: block vda: the capability attribute has been deprecated.
Nov 23 15:01:10 np0005532763 irqbalance[825]: Cannot change IRQ 25 affinity: Operation not permitted
Nov 23 15:01:10 np0005532763 irqbalance[825]: IRQ 25 affinity is now unmanaged
Nov 23 15:01:10 np0005532763 irqbalance[825]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 23 15:01:10 np0005532763 irqbalance[825]: IRQ 31 affinity is now unmanaged
Nov 23 15:01:10 np0005532763 irqbalance[825]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 23 15:01:10 np0005532763 irqbalance[825]: IRQ 28 affinity is now unmanaged
Nov 23 15:01:10 np0005532763 irqbalance[825]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 23 15:01:10 np0005532763 irqbalance[825]: IRQ 32 affinity is now unmanaged
Nov 23 15:01:10 np0005532763 irqbalance[825]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 23 15:01:10 np0005532763 irqbalance[825]: IRQ 30 affinity is now unmanaged
Nov 23 15:01:10 np0005532763 irqbalance[825]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 23 15:01:10 np0005532763 irqbalance[825]: IRQ 29 affinity is now unmanaged
Nov 23 15:01:10 np0005532763 dracut[1299]: *** Including module: kernel-modules-extra ***
Nov 23 15:01:10 np0005532763 dracut[1299]: *** Including module: qemu ***
Nov 23 15:01:10 np0005532763 dracut[1299]: *** Including module: fstab-sys ***
Nov 23 15:01:10 np0005532763 dracut[1299]: *** Including module: rootfs-block ***
Nov 23 15:01:10 np0005532763 dracut[1299]: *** Including module: terminfo ***
Nov 23 15:01:11 np0005532763 dracut[1299]: *** Including module: udev-rules ***
Nov 23 15:01:11 np0005532763 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 23 15:01:11 np0005532763 dracut[1299]: Skipping udev rule: 91-permissions.rules
Nov 23 15:01:11 np0005532763 dracut[1299]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 23 15:01:11 np0005532763 dracut[1299]: *** Including module: virtiofs ***
Nov 23 15:01:11 np0005532763 dracut[1299]: *** Including module: dracut-systemd ***
Nov 23 15:01:12 np0005532763 dracut[1299]: *** Including module: usrmount ***
Nov 23 15:01:12 np0005532763 dracut[1299]: *** Including module: base ***
Nov 23 15:01:12 np0005532763 dracut[1299]: *** Including module: fs-lib ***
Nov 23 15:01:12 np0005532763 dracut[1299]: *** Including module: kdumpbase ***
Nov 23 15:01:13 np0005532763 dracut[1299]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 23 15:01:13 np0005532763 dracut[1299]:  microcode_ctl module: mangling fw_dir
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: configuration "intel" is ignored
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 23 15:01:13 np0005532763 dracut[1299]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 23 15:01:13 np0005532763 dracut[1299]: *** Including module: openssl ***
Nov 23 15:01:13 np0005532763 dracut[1299]: *** Including module: shutdown ***
Nov 23 15:01:13 np0005532763 dracut[1299]: *** Including module: squash ***
Nov 23 15:01:14 np0005532763 dracut[1299]: *** Including modules done ***
Nov 23 15:01:14 np0005532763 dracut[1299]: *** Installing kernel module dependencies ***
Nov 23 15:01:15 np0005532763 dracut[1299]: *** Installing kernel module dependencies done ***
Nov 23 15:01:15 np0005532763 dracut[1299]: *** Resolving executable dependencies ***
Nov 23 15:01:18 np0005532763 dracut[1299]: *** Resolving executable dependencies done ***
Nov 23 15:01:18 np0005532763 dracut[1299]: *** Generating early-microcode cpio image ***
Nov 23 15:01:18 np0005532763 dracut[1299]: *** Store current command line parameters ***
Nov 23 15:01:18 np0005532763 dracut[1299]: Stored kernel commandline:
Nov 23 15:01:18 np0005532763 dracut[1299]: No dracut internal kernel commandline stored in the initramfs
Nov 23 15:01:18 np0005532763 dracut[1299]: *** Install squash loader ***
Nov 23 15:01:20 np0005532763 dracut[1299]: *** Squashing the files inside the initramfs ***
Nov 23 15:01:21 np0005532763 dracut[1299]: *** Squashing the files inside the initramfs done ***
Nov 23 15:01:21 np0005532763 dracut[1299]: *** Creating image file '/boot/initramfs-5.14.0-639.el9.x86_64kdump.img' ***
Nov 23 15:01:21 np0005532763 dracut[1299]: *** Hardlinking files ***
Nov 23 15:01:21 np0005532763 dracut[1299]: *** Hardlinking files done ***
Nov 23 15:01:23 np0005532763 dracut[1299]: *** Creating initramfs image file '/boot/initramfs-5.14.0-639.el9.x86_64kdump.img' done ***
Nov 23 15:01:24 np0005532763 kdumpctl[1025]: kdump: kexec: loaded kdump kernel
Nov 23 15:01:24 np0005532763 kdumpctl[1025]: kdump: Starting kdump: [OK]
Nov 23 15:01:24 np0005532763 systemd[1]: Finished Crash recovery kernel arming.
Nov 23 15:01:24 np0005532763 systemd[1]: Startup finished in 1.907s (kernel) + 4.803s (initrd) + 27.598s (userspace) = 34.308s.
Nov 23 15:01:31 np0005532763 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 23 15:03:08 np0005532763 systemd[1]: Created slice User Slice of UID 1000.
Nov 23 15:03:08 np0005532763 systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 23 15:03:08 np0005532763 systemd-logind[830]: New session 1 of user zuul.
Nov 23 15:03:09 np0005532763 systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 23 15:03:09 np0005532763 systemd[1]: Starting User Manager for UID 1000...
Nov 23 15:03:09 np0005532763 systemd[4308]: Queued start job for default target Main User Target.
Nov 23 15:03:09 np0005532763 systemd[4308]: Created slice User Application Slice.
Nov 23 15:03:09 np0005532763 systemd[4308]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 23 15:03:09 np0005532763 systemd[4308]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 15:03:09 np0005532763 systemd[4308]: Reached target Paths.
Nov 23 15:03:09 np0005532763 systemd[4308]: Reached target Timers.
Nov 23 15:03:09 np0005532763 systemd[4308]: Starting D-Bus User Message Bus Socket...
Nov 23 15:03:09 np0005532763 systemd[4308]: Starting Create User's Volatile Files and Directories...
Nov 23 15:03:09 np0005532763 systemd[4308]: Listening on D-Bus User Message Bus Socket.
Nov 23 15:03:09 np0005532763 systemd[4308]: Reached target Sockets.
Nov 23 15:03:09 np0005532763 systemd[4308]: Finished Create User's Volatile Files and Directories.
Nov 23 15:03:09 np0005532763 systemd[4308]: Reached target Basic System.
Nov 23 15:03:09 np0005532763 systemd[4308]: Reached target Main User Target.
Nov 23 15:03:09 np0005532763 systemd[4308]: Startup finished in 177ms.
Nov 23 15:03:09 np0005532763 systemd[1]: Started User Manager for UID 1000.
Nov 23 15:03:09 np0005532763 systemd[1]: Started Session 1 of User zuul.
Nov 23 15:03:09 np0005532763 python3[4390]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:03:15 np0005532763 python3[4418]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:03:21 np0005532763 python3[4476]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:03:22 np0005532763 python3[4516]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 23 15:03:24 np0005532763 python3[4542]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCtaVH+Hfp24GC/nLOCl87TIJDf22iIpXaDmkip6hyFZ60lyVpfYxFl6Z4FqAbKci+Ock4NHD78xcKBN+nqpMJyIdLDl6IlqwxWyUc/lX5/TIm6PknK9ykLQzLzQZzRt1Mk1hK89Am3bbY9TVh2ZdujVyOmjWLVqA/0FhkvYKJWaid0pgs6EdTygKGzSfc7V7Zm4ijA+aHyny1AE6h4zzdGP/d6AL8fjaGD/LpcU6DnbbD9WHzrmCJXOyJa5/Ky5sttSY3WpH33eL7o554W1og4Dq5c+z/Pc0NlJT1DXPpxrtrLpJ57vb04Ae1Wg5PeG+MECxQWJRQBS51hNbLb4KTkDErpMaWbfcwdnzisQHazTgjNidmG34/j4ZvJ/NP2OkEBabHukyMvOCFw3Ew9lQ5eR2EiNjFtdvI12kRiXyyk9Ti3dsncy9kfInD5nPUeVGnxbIGdwP/T5Z2crXhgdrIWCRjRMvV/756tjKFXfzl/eIzO6UcLkU2I9qdqZpL0h8U= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:24 np0005532763 python3[4566]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:03:25 np0005532763 python3[4665]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:03:25 np0005532763 python3[4736]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763928205.0442128-254-171083343463087/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=b927b3f7e94443b59884cfdc0421ba80_id_rsa follow=False checksum=b8b11f458d3dcaed5d0ce620e052c77faf8a3312 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:03:26 np0005532763 python3[4859]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:03:26 np0005532763 python3[4930]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763928206.1086512-308-238677000556798/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=b927b3f7e94443b59884cfdc0421ba80_id_rsa.pub follow=False checksum=c143f6be1d4420dad576f5c3c6738e84bfb79a9b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:03:28 np0005532763 python3[4978]: ansible-ping Invoked with data=pong
Nov 23 15:03:29 np0005532763 python3[5002]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:03:30 np0005532763 irqbalance[825]: Cannot change IRQ 26 affinity: Operation not permitted
Nov 23 15:03:30 np0005532763 irqbalance[825]: IRQ 26 affinity is now unmanaged
Nov 23 15:03:31 np0005532763 python3[5060]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 23 15:03:32 np0005532763 python3[5092]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:03:32 np0005532763 python3[5116]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:03:33 np0005532763 python3[5140]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:03:34 np0005532763 python3[5164]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:03:34 np0005532763 python3[5188]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:03:34 np0005532763 python3[5212]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:03:37 np0005532763 python3[5238]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:03:37 np0005532763 python3[5316]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:03:38 np0005532763 python3[5389]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763928217.3289793-35-26572380181002/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:03:39 np0005532763 python3[5439]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:39 np0005532763 python3[5463]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:39 np0005532763 python3[5487]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:39 np0005532763 python3[5511]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:40 np0005532763 python3[5535]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:40 np0005532763 python3[5559]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:40 np0005532763 python3[5583]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:41 np0005532763 python3[5607]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:41 np0005532763 python3[5631]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:41 np0005532763 python3[5655]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:41 np0005532763 python3[5679]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:42 np0005532763 python3[5703]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:42 np0005532763 python3[5727]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:42 np0005532763 python3[5751]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:43 np0005532763 python3[5775]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:43 np0005532763 python3[5799]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:43 np0005532763 python3[5823]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:44 np0005532763 python3[5847]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:44 np0005532763 python3[5871]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:44 np0005532763 python3[5895]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:44 np0005532763 python3[5919]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:45 np0005532763 python3[5943]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:45 np0005532763 python3[5967]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:45 np0005532763 python3[5991]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:46 np0005532763 python3[6015]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:46 np0005532763 python3[6039]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:03:48 np0005532763 python3[6065]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 23 15:03:48 np0005532763 systemd[1]: Starting Time & Date Service...
Nov 23 15:03:48 np0005532763 systemd[1]: Started Time & Date Service.
Nov 23 15:03:48 np0005532763 systemd-timedated[6067]: Changed time zone to 'UTC' (UTC).
Nov 23 15:03:49 np0005532763 python3[6096]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:03:49 np0005532763 python3[6172]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:03:50 np0005532763 python3[6243]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1763928229.492532-254-150799012724331/source _original_basename=tmp_et3svwx follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:03:50 np0005532763 python3[6343]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:03:51 np0005532763 python3[6414]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763928230.4269097-304-89836612959351/source _original_basename=tmpjvgnufwv follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:03:52 np0005532763 python3[6516]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:03:52 np0005532763 python3[6589]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763928231.7178698-384-263606479803876/source _original_basename=tmpqkykodlo follow=False checksum=634f92f67c90daca0d0661ff9e082945cbba2c1b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:03:53 np0005532763 python3[6637]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:03:53 np0005532763 python3[6663]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:03:53 np0005532763 python3[6743]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:03:54 np0005532763 python3[6816]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1763928233.509296-454-218753361205072/source _original_basename=tmpcr9z3mhp follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:03:54 np0005532763 python3[6867]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-4746-eccf-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:03:55 np0005532763 python3[6895]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-4746-eccf-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 23 15:03:56 np0005532763 python3[6923]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:04:16 np0005532763 python3[6949]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:04:18 np0005532763 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 23 15:05:16 np0005532763 systemd-logind[830]: Session 1 logged out. Waiting for processes to exit.
Nov 23 15:05:27 np0005532763 systemd[4308]: Starting Mark boot as successful...
Nov 23 15:05:27 np0005532763 systemd[4308]: Finished Mark boot as successful.
Nov 23 15:05:47 np0005532763 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 23 15:05:47 np0005532763 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 23 15:05:47 np0005532763 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 23 15:05:47 np0005532763 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 23 15:05:47 np0005532763 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 23 15:05:47 np0005532763 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 23 15:05:47 np0005532763 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 23 15:05:47 np0005532763 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 23 15:05:47 np0005532763 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 23 15:05:47 np0005532763 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 23 15:05:47 np0005532763 NetworkManager[865]: <info>  [1763928347.8237] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 23 15:05:47 np0005532763 systemd-udevd[6955]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 15:05:47 np0005532763 NetworkManager[865]: <info>  [1763928347.8445] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 23 15:05:47 np0005532763 NetworkManager[865]: <info>  [1763928347.8490] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 23 15:05:47 np0005532763 NetworkManager[865]: <info>  [1763928347.8497] device (eth1): carrier: link connected
Nov 23 15:05:47 np0005532763 NetworkManager[865]: <info>  [1763928347.8501] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 23 15:05:47 np0005532763 NetworkManager[865]: <info>  [1763928347.8512] policy: auto-activating connection 'Wired connection 1' (19fc6877-689b-3c5c-bc86-f9bfe6b22958)
Nov 23 15:05:47 np0005532763 NetworkManager[865]: <info>  [1763928347.8518] device (eth1): Activation: starting connection 'Wired connection 1' (19fc6877-689b-3c5c-bc86-f9bfe6b22958)
Nov 23 15:05:47 np0005532763 NetworkManager[865]: <info>  [1763928347.8520] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 23 15:05:47 np0005532763 NetworkManager[865]: <info>  [1763928347.8524] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 23 15:05:47 np0005532763 NetworkManager[865]: <info>  [1763928347.8531] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 23 15:05:47 np0005532763 NetworkManager[865]: <info>  [1763928347.8538] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 23 15:05:48 np0005532763 systemd-logind[830]: New session 3 of user zuul.
Nov 23 15:05:48 np0005532763 systemd[1]: Started Session 3 of User zuul.
Nov 23 15:05:49 np0005532763 python3[6986]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-f412-6632-0000000001f6-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:05:59 np0005532763 python3[7066]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:05:59 np0005532763 python3[7139]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763928358.8901465-206-258194757918840/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=1b6ae75c590d16a16df2dde36f385ce69c541849 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:06:00 np0005532763 python3[7189]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:06:00 np0005532763 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 23 15:06:00 np0005532763 systemd[1]: Stopped Network Manager Wait Online.
Nov 23 15:06:00 np0005532763 systemd[1]: Stopping Network Manager Wait Online...
Nov 23 15:06:00 np0005532763 systemd[1]: Stopping Network Manager...
Nov 23 15:06:00 np0005532763 NetworkManager[865]: <info>  [1763928360.2856] caught SIGTERM, shutting down normally.
Nov 23 15:06:00 np0005532763 NetworkManager[865]: <info>  [1763928360.2873] dhcp4 (eth0): canceled DHCP transaction
Nov 23 15:06:00 np0005532763 NetworkManager[865]: <info>  [1763928360.2875] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 23 15:06:00 np0005532763 NetworkManager[865]: <info>  [1763928360.2876] dhcp4 (eth0): state changed no lease
Nov 23 15:06:00 np0005532763 NetworkManager[865]: <info>  [1763928360.2881] manager: NetworkManager state is now CONNECTING
Nov 23 15:06:00 np0005532763 NetworkManager[865]: <info>  [1763928360.3033] dhcp4 (eth1): canceled DHCP transaction
Nov 23 15:06:00 np0005532763 NetworkManager[865]: <info>  [1763928360.3033] dhcp4 (eth1): state changed no lease
Nov 23 15:06:00 np0005532763 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 23 15:06:00 np0005532763 NetworkManager[865]: <info>  [1763928360.3112] exiting (success)
Nov 23 15:06:00 np0005532763 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 23 15:06:00 np0005532763 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 23 15:06:00 np0005532763 systemd[1]: Stopped Network Manager.
Nov 23 15:06:00 np0005532763 systemd[1]: NetworkManager.service: Consumed 2.007s CPU time, 10.0M memory peak.
Nov 23 15:06:00 np0005532763 systemd[1]: Starting Network Manager...
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.3944] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:a896f7ce-22ce-43ea-a9ba-7e128facc2bb)
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.3945] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.4033] manager[0x560477cd1070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 23 15:06:00 np0005532763 systemd[1]: Starting Hostname Service...
Nov 23 15:06:00 np0005532763 systemd[1]: Started Hostname Service.
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5151] hostname: hostname: using hostnamed
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5152] hostname: static hostname changed from (none) to "np0005532763.novalocal"
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5159] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5166] manager[0x560477cd1070]: rfkill: Wi-Fi hardware radio set enabled
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5166] manager[0x560477cd1070]: rfkill: WWAN hardware radio set enabled
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5210] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5211] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5211] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5213] manager: Networking is enabled by state file
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5216] settings: Loaded settings plugin: keyfile (internal)
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5222] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5264] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5281] dhcp: init: Using DHCP client 'internal'
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5287] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5297] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5309] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5324] device (lo): Activation: starting connection 'lo' (ac07c606-0f80-4608-b8dd-99a45b6a547d)
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5335] device (eth0): carrier: link connected
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5343] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5350] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5351] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5361] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5371] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5381] device (eth1): carrier: link connected
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5388] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5396] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (19fc6877-689b-3c5c-bc86-f9bfe6b22958) (indicated)
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5396] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5405] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5415] device (eth1): Activation: starting connection 'Wired connection 1' (19fc6877-689b-3c5c-bc86-f9bfe6b22958)
Nov 23 15:06:00 np0005532763 systemd[1]: Started Network Manager.
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5425] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5432] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5436] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5439] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5444] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5449] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5453] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5458] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5464] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5476] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5481] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5503] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5509] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5533] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5540] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5547] device (lo): Activation: successful, device activated.
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5556] dhcp4 (eth0): state changed new lease, address=38.102.83.111
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5566] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 23 15:06:00 np0005532763 systemd[1]: Starting Network Manager Wait Online...
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5643] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5682] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5685] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5691] manager: NetworkManager state is now CONNECTED_SITE
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5698] device (eth0): Activation: successful, device activated.
Nov 23 15:06:00 np0005532763 NetworkManager[7199]: <info>  [1763928360.5707] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 23 15:06:00 np0005532763 python3[7273]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-f412-6632-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:06:10 np0005532763 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 23 15:06:30 np0005532763 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4088] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 23 15:06:45 np0005532763 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 23 15:06:45 np0005532763 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4452] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4462] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4483] device (eth1): Activation: successful, device activated.
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4497] manager: startup complete
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4501] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <warn>  [1763928405.4519] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4536] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 23 15:06:45 np0005532763 systemd[1]: Finished Network Manager Wait Online.
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4723] dhcp4 (eth1): canceled DHCP transaction
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4724] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4724] dhcp4 (eth1): state changed no lease
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4745] policy: auto-activating connection 'ci-private-network' (24d1ddd0-0657-595a-9a2d-45d9b7719a8e)
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4752] device (eth1): Activation: starting connection 'ci-private-network' (24d1ddd0-0657-595a-9a2d-45d9b7719a8e)
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4754] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4759] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4768] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4777] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4821] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4824] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 23 15:06:45 np0005532763 NetworkManager[7199]: <info>  [1763928405.4832] device (eth1): Activation: successful, device activated.
Nov 23 15:06:55 np0005532763 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 23 15:07:01 np0005532763 systemd[1]: session-3.scope: Deactivated successfully.
Nov 23 15:07:01 np0005532763 systemd[1]: session-3.scope: Consumed 1.870s CPU time.
Nov 23 15:07:01 np0005532763 systemd-logind[830]: Session 3 logged out. Waiting for processes to exit.
Nov 23 15:07:01 np0005532763 systemd-logind[830]: Removed session 3.
Nov 23 15:07:11 np0005532763 systemd-logind[830]: New session 4 of user zuul.
Nov 23 15:07:11 np0005532763 systemd[1]: Started Session 4 of User zuul.
Nov 23 15:07:12 np0005532763 python3[7386]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:07:12 np0005532763 python3[7460]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763928432.122757-373-26114690903796/source _original_basename=tmp04fynuhy follow=False checksum=3134bd1d03fba929119b03a893a690ab48d9a2ea backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:07:14 np0005532763 systemd[1]: session-4.scope: Deactivated successfully.
Nov 23 15:07:14 np0005532763 systemd-logind[830]: Session 4 logged out. Waiting for processes to exit.
Nov 23 15:07:14 np0005532763 systemd-logind[830]: Removed session 4.
Nov 23 15:08:27 np0005532763 systemd[4308]: Created slice User Background Tasks Slice.
Nov 23 15:08:27 np0005532763 systemd[4308]: Starting Cleanup of User's Temporary Files and Directories...
Nov 23 15:08:27 np0005532763 systemd[4308]: Finished Cleanup of User's Temporary Files and Directories.
Nov 23 15:12:18 np0005532763 systemd-logind[830]: New session 5 of user zuul.
Nov 23 15:12:18 np0005532763 systemd[1]: Started Session 5 of User zuul.
Nov 23 15:12:19 np0005532763 python3[7526]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-bee1-1da1-000000001cd8-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:12:19 np0005532763 python3[7554]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:12:19 np0005532763 python3[7581]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:12:20 np0005532763 python3[7607]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:12:20 np0005532763 python3[7633]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:12:20 np0005532763 python3[7659]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:12:21 np0005532763 python3[7737]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:12:21 np0005532763 python3[7810]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763928741.0979905-513-66888474415613/source _original_basename=tmp09us2ye9 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:12:22 np0005532763 python3[7860]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 15:12:22 np0005532763 systemd[1]: Reloading.
Nov 23 15:12:22 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:12:22 np0005532763 systemd[1]: Starting dnf makecache...
Nov 23 15:12:23 np0005532763 dnf[7892]: Failed determining last makecache time.
Nov 23 15:12:23 np0005532763 dnf[7892]: CentOS Stream 9 - BaseOS                         53 kB/s | 7.3 kB     00:00
Nov 23 15:12:23 np0005532763 dnf[7892]: CentOS Stream 9 - AppStream                      81 kB/s | 7.4 kB     00:00
Nov 23 15:12:24 np0005532763 dnf[7892]: CentOS Stream 9 - CRB                            69 kB/s | 7.2 kB     00:00
Nov 23 15:12:24 np0005532763 python3[7925]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 23 15:12:24 np0005532763 dnf[7892]: CentOS Stream 9 - Extras packages                24 kB/s | 8.3 kB     00:00
Nov 23 15:12:24 np0005532763 dnf[7892]: Metadata cache created.
Nov 23 15:12:24 np0005532763 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 23 15:12:24 np0005532763 systemd[1]: Finished dnf makecache.
Nov 23 15:12:24 np0005532763 python3[7951]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:12:25 np0005532763 python3[7979]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:12:25 np0005532763 python3[8007]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:12:25 np0005532763 python3[8035]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:12:26 np0005532763 python3[8062]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-bee1-1da1-000000001cdf-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:12:26 np0005532763 python3[8092]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 15:12:29 np0005532763 systemd[1]: session-5.scope: Deactivated successfully.
Nov 23 15:12:29 np0005532763 systemd[1]: session-5.scope: Consumed 5.017s CPU time.
Nov 23 15:12:29 np0005532763 systemd-logind[830]: Session 5 logged out. Waiting for processes to exit.
Nov 23 15:12:29 np0005532763 systemd-logind[830]: Removed session 5.
Nov 23 15:12:31 np0005532763 systemd-logind[830]: New session 6 of user zuul.
Nov 23 15:12:31 np0005532763 systemd[1]: Started Session 6 of User zuul.
Nov 23 15:12:31 np0005532763 python3[8125]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 15:12:45 np0005532763 kernel: SELinux:  Converting 385 SID table entries...
Nov 23 15:12:45 np0005532763 kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 15:12:45 np0005532763 kernel: SELinux:  policy capability open_perms=1
Nov 23 15:12:45 np0005532763 kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 15:12:45 np0005532763 kernel: SELinux:  policy capability always_check_network=0
Nov 23 15:12:45 np0005532763 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 15:12:45 np0005532763 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 15:12:45 np0005532763 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 15:12:53 np0005532763 kernel: SELinux:  Converting 385 SID table entries...
Nov 23 15:12:53 np0005532763 kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 15:12:53 np0005532763 kernel: SELinux:  policy capability open_perms=1
Nov 23 15:12:53 np0005532763 kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 15:12:53 np0005532763 kernel: SELinux:  policy capability always_check_network=0
Nov 23 15:12:53 np0005532763 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 15:12:53 np0005532763 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 15:12:53 np0005532763 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 15:13:05 np0005532763 kernel: SELinux:  Converting 385 SID table entries...
Nov 23 15:13:05 np0005532763 kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 15:13:05 np0005532763 kernel: SELinux:  policy capability open_perms=1
Nov 23 15:13:05 np0005532763 kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 15:13:05 np0005532763 kernel: SELinux:  policy capability always_check_network=0
Nov 23 15:13:05 np0005532763 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 15:13:05 np0005532763 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 15:13:05 np0005532763 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 15:13:06 np0005532763 setsebool[8184]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 23 15:13:06 np0005532763 setsebool[8184]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 23 15:13:18 np0005532763 kernel: SELinux:  Converting 388 SID table entries...
Nov 23 15:13:18 np0005532763 kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 15:13:18 np0005532763 kernel: SELinux:  policy capability open_perms=1
Nov 23 15:13:18 np0005532763 kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 15:13:18 np0005532763 kernel: SELinux:  policy capability always_check_network=0
Nov 23 15:13:18 np0005532763 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 15:13:18 np0005532763 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 15:13:18 np0005532763 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 15:13:36 np0005532763 dbus-broker-launch[812]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 23 15:13:36 np0005532763 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 15:13:36 np0005532763 systemd[1]: Starting man-db-cache-update.service...
Nov 23 15:13:36 np0005532763 systemd[1]: Reloading.
Nov 23 15:13:36 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:13:36 np0005532763 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 15:13:40 np0005532763 irqbalance[825]: Cannot change IRQ 27 affinity: Operation not permitted
Nov 23 15:13:40 np0005532763 irqbalance[825]: IRQ 27 affinity is now unmanaged
Nov 23 15:13:46 np0005532763 python3[14290]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-ba7b-575b-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:13:47 np0005532763 kernel: evm: overlay not supported
Nov 23 15:13:47 np0005532763 systemd[4308]: Starting D-Bus User Message Bus...
Nov 23 15:13:47 np0005532763 dbus-broker-launch[14696]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 23 15:13:47 np0005532763 dbus-broker-launch[14696]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 23 15:13:47 np0005532763 systemd[4308]: Started D-Bus User Message Bus.
Nov 23 15:13:47 np0005532763 dbus-broker-lau[14696]: Ready
Nov 23 15:13:47 np0005532763 systemd[4308]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 23 15:13:47 np0005532763 systemd[4308]: Created slice Slice /user.
Nov 23 15:13:47 np0005532763 systemd[4308]: podman-14636.scope: unit configures an IP firewall, but not running as root.
Nov 23 15:13:47 np0005532763 systemd[4308]: (This warning is only shown for the first unit using IP firewalling.)
Nov 23 15:13:47 np0005532763 systemd[4308]: Started podman-14636.scope.
Nov 23 15:13:47 np0005532763 systemd[4308]: Started podman-pause-5f32df79.scope.
Nov 23 15:13:47 np0005532763 systemd[1]: session-6.scope: Deactivated successfully.
Nov 23 15:13:47 np0005532763 systemd[1]: session-6.scope: Consumed 1min 2.257s CPU time.
Nov 23 15:13:47 np0005532763 systemd-logind[830]: Session 6 logged out. Waiting for processes to exit.
Nov 23 15:13:47 np0005532763 systemd-logind[830]: Removed session 6.
Nov 23 15:14:07 np0005532763 systemd-logind[830]: New session 7 of user zuul.
Nov 23 15:14:07 np0005532763 systemd[1]: Started Session 7 of User zuul.
Nov 23 15:14:08 np0005532763 python3[21247]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBA87KGYjjoyogEDAuKEHrB6Oxv3mIvu13bhzDbjQjrNyl3D2q3szz508Yk2UHZaBKDHJbLxThWYWGwZpHtr+UTo= zuul@np0005532760.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:14:08 np0005532763 python3[21398]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBA87KGYjjoyogEDAuKEHrB6Oxv3mIvu13bhzDbjQjrNyl3D2q3szz508Yk2UHZaBKDHJbLxThWYWGwZpHtr+UTo= zuul@np0005532760.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:14:09 np0005532763 python3[21686]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005532763.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 23 15:14:10 np0005532763 python3[21873]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBA87KGYjjoyogEDAuKEHrB6Oxv3mIvu13bhzDbjQjrNyl3D2q3szz508Yk2UHZaBKDHJbLxThWYWGwZpHtr+UTo= zuul@np0005532760.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 23 15:14:10 np0005532763 python3[22101]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:14:10 np0005532763 python3[22294]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763928850.2495265-153-127641313294555/source _original_basename=tmp8d68qn3w follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:14:11 np0005532763 python3[22590]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Nov 23 15:14:11 np0005532763 systemd[1]: Starting Hostname Service...
Nov 23 15:14:11 np0005532763 systemd[1]: Started Hostname Service.
Nov 23 15:14:12 np0005532763 systemd-hostnamed[22686]: Changed pretty hostname to 'compute-2'
Nov 23 15:14:12 np0005532763 systemd-hostnamed[22686]: Hostname set to <compute-2> (static)
Nov 23 15:14:12 np0005532763 NetworkManager[7199]: <info>  [1763928852.0123] hostname: static hostname changed from "np0005532763.novalocal" to "compute-2"
Nov 23 15:14:12 np0005532763 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 23 15:14:12 np0005532763 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 23 15:14:12 np0005532763 systemd[1]: session-7.scope: Deactivated successfully.
Nov 23 15:14:12 np0005532763 systemd[1]: session-7.scope: Consumed 2.765s CPU time.
Nov 23 15:14:12 np0005532763 systemd-logind[830]: Session 7 logged out. Waiting for processes to exit.
Nov 23 15:14:12 np0005532763 systemd-logind[830]: Removed session 7.
Nov 23 15:14:22 np0005532763 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 23 15:14:37 np0005532763 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 15:14:37 np0005532763 systemd[1]: Finished man-db-cache-update.service.
Nov 23 15:14:37 np0005532763 systemd[1]: man-db-cache-update.service: Consumed 1min 14.040s CPU time.
Nov 23 15:14:37 np0005532763 systemd[1]: run-r88c7a2a254e74b119b98467dee60a1ec.service: Deactivated successfully.
Nov 23 15:14:42 np0005532763 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 23 15:15:54 np0005532763 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 23 15:15:54 np0005532763 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 23 15:15:54 np0005532763 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 23 15:15:54 np0005532763 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 23 15:18:11 np0005532763 systemd-logind[830]: New session 8 of user zuul.
Nov 23 15:18:12 np0005532763 systemd[1]: Started Session 8 of User zuul.
Nov 23 15:18:12 np0005532763 python3[29999]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:18:14 np0005532763 python3[30115]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:18:15 np0005532763 python3[30188]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763929094.313926-33978-281355781052509/source mode=0755 _original_basename=delorean.repo follow=False checksum=1830be8248976a7f714fb01ca8550e92dfc79ad2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:18:15 np0005532763 python3[30214]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:18:15 np0005532763 python3[30287]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763929094.313926-33978-281355781052509/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:18:16 np0005532763 python3[30313]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:18:16 np0005532763 python3[30386]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763929094.313926-33978-281355781052509/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:18:16 np0005532763 python3[30412]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:18:17 np0005532763 python3[30485]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763929094.313926-33978-281355781052509/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:18:17 np0005532763 python3[30511]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:18:17 np0005532763 python3[30584]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763929094.313926-33978-281355781052509/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:18:18 np0005532763 python3[30610]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:18:18 np0005532763 python3[30683]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763929094.313926-33978-281355781052509/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:18:18 np0005532763 python3[30709]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:18:19 np0005532763 python3[30782]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763929094.313926-33978-281355781052509/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6646317362318a9831d66a1804f6bb7dd1b97cd5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:18:31 np0005532763 python3[30830]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:23:31 np0005532763 systemd[1]: session-8.scope: Deactivated successfully.
Nov 23 15:23:31 np0005532763 systemd[1]: session-8.scope: Consumed 6.105s CPU time.
Nov 23 15:23:31 np0005532763 systemd-logind[830]: Session 8 logged out. Waiting for processes to exit.
Nov 23 15:23:31 np0005532763 systemd-logind[830]: Removed session 8.
Nov 23 15:29:41 np0005532763 systemd-logind[830]: New session 9 of user zuul.
Nov 23 15:29:41 np0005532763 systemd[1]: Started Session 9 of User zuul.
Nov 23 15:29:42 np0005532763 python3.9[31004]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:29:43 np0005532763 python3.9[31185]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:29:51 np0005532763 systemd[1]: session-9.scope: Deactivated successfully.
Nov 23 15:29:51 np0005532763 systemd[1]: session-9.scope: Consumed 8.576s CPU time.
Nov 23 15:29:51 np0005532763 systemd-logind[830]: Session 9 logged out. Waiting for processes to exit.
Nov 23 15:29:51 np0005532763 systemd-logind[830]: Removed session 9.
Nov 23 15:30:06 np0005532763 systemd-logind[830]: New session 10 of user zuul.
Nov 23 15:30:06 np0005532763 systemd[1]: Started Session 10 of User zuul.
Nov 23 15:30:07 np0005532763 python3.9[31397]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 23 15:30:09 np0005532763 python3.9[31571]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:30:10 np0005532763 python3.9[31723]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:30:11 np0005532763 python3.9[31876]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:30:12 np0005532763 python3.9[32028]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:30:13 np0005532763 python3.9[32180]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:30:14 np0005532763 python3.9[32303]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763929812.9111185-179-243443121675023/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:30:15 np0005532763 python3.9[32455]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:30:16 np0005532763 python3.9[32611]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:30:17 np0005532763 python3.9[32763]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:30:17 np0005532763 python3.9[32913]: ansible-ansible.builtin.service_facts Invoked
Nov 23 15:30:23 np0005532763 python3.9[33166]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:30:24 np0005532763 python3.9[33316]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:30:25 np0005532763 python3.9[33470]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:30:26 np0005532763 python3.9[33628]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:30:27 np0005532763 python3.9[33712]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:31:09 np0005532763 systemd[1]: Reloading.
Nov 23 15:31:09 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:31:09 np0005532763 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 23 15:31:10 np0005532763 systemd[1]: Reloading.
Nov 23 15:31:10 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:31:10 np0005532763 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 23 15:31:10 np0005532763 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 23 15:31:10 np0005532763 systemd[1]: Reloading.
Nov 23 15:31:10 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:31:10 np0005532763 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 23 15:31:11 np0005532763 dbus-broker-launch[794]: Noticed file-system modification, trigger reload.
Nov 23 15:31:11 np0005532763 dbus-broker-launch[794]: Noticed file-system modification, trigger reload.
Nov 23 15:31:11 np0005532763 dbus-broker-launch[794]: Noticed file-system modification, trigger reload.
Nov 23 15:32:13 np0005532763 kernel: SELinux:  Converting 2716 SID table entries...
Nov 23 15:32:13 np0005532763 kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 15:32:13 np0005532763 kernel: SELinux:  policy capability open_perms=1
Nov 23 15:32:13 np0005532763 kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 15:32:13 np0005532763 kernel: SELinux:  policy capability always_check_network=0
Nov 23 15:32:13 np0005532763 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 15:32:13 np0005532763 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 15:32:13 np0005532763 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 15:32:14 np0005532763 dbus-broker-launch[812]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 23 15:32:14 np0005532763 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 15:32:14 np0005532763 systemd[1]: Starting man-db-cache-update.service...
Nov 23 15:32:14 np0005532763 systemd[1]: Reloading.
Nov 23 15:32:14 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:32:14 np0005532763 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 15:32:15 np0005532763 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 15:32:15 np0005532763 systemd[1]: Finished man-db-cache-update.service.
Nov 23 15:32:15 np0005532763 systemd[1]: man-db-cache-update.service: Consumed 1.609s CPU time.
Nov 23 15:32:15 np0005532763 systemd[1]: run-r451d60ffce8c4698971e7df2d61dfdbb.service: Deactivated successfully.
Nov 23 15:32:29 np0005532763 python3.9[35235]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:32:31 np0005532763 python3.9[35516]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 23 15:32:32 np0005532763 python3.9[35668]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 23 15:32:34 np0005532763 python3.9[35821]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:32:38 np0005532763 python3.9[35973]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 23 15:32:41 np0005532763 python3.9[36125]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:32:47 np0005532763 python3.9[36277]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:32:47 np0005532763 python3.9[36400]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763929966.5273442-669-239440641860438/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=848940549ac5db80ec615963c7c09743939a62fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:32:49 np0005532763 python3.9[36552]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:32:49 np0005532763 python3.9[36704]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:32:50 np0005532763 python3.9[36857]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:32:52 np0005532763 python3.9[37009]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 23 15:32:53 np0005532763 python3.9[37162]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 23 15:32:53 np0005532763 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 15:32:53 np0005532763 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 15:32:54 np0005532763 python3.9[37321]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 23 15:32:55 np0005532763 python3.9[37481]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 23 15:32:55 np0005532763 python3.9[37634]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 23 15:32:56 np0005532763 python3.9[37792]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 23 15:32:58 np0005532763 python3.9[37944]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:33:00 np0005532763 python3.9[38097]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:33:01 np0005532763 python3.9[38249]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:33:02 np0005532763 python3.9[38372]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763929980.8707528-1026-86144845218805/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:33:03 np0005532763 python3.9[38524]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:33:03 np0005532763 systemd[1]: Starting Load Kernel Modules...
Nov 23 15:33:03 np0005532763 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 23 15:33:03 np0005532763 kernel: Bridge firewalling registered
Nov 23 15:33:03 np0005532763 systemd-modules-load[38528]: Inserted module 'br_netfilter'
Nov 23 15:33:03 np0005532763 systemd[1]: Finished Load Kernel Modules.
Nov 23 15:33:04 np0005532763 python3.9[38683]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:33:05 np0005532763 python3.9[38806]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763929983.6953745-1096-84047026251852/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:33:06 np0005532763 python3.9[38958]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:33:09 np0005532763 dbus-broker-launch[794]: Noticed file-system modification, trigger reload.
Nov 23 15:33:09 np0005532763 dbus-broker-launch[794]: Noticed file-system modification, trigger reload.
Nov 23 15:33:10 np0005532763 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 15:33:10 np0005532763 systemd[1]: Starting man-db-cache-update.service...
Nov 23 15:33:10 np0005532763 systemd[1]: Reloading.
Nov 23 15:33:10 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:33:10 np0005532763 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 15:33:11 np0005532763 python3.9[40239]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:33:12 np0005532763 python3.9[40926]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 23 15:33:13 np0005532763 python3.9[41588]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:33:14 np0005532763 python3.9[42421]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:33:14 np0005532763 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 23 15:33:15 np0005532763 systemd[1]: Starting Authorization Manager...
Nov 23 15:33:15 np0005532763 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 23 15:33:15 np0005532763 polkitd[43157]: Started polkitd version 0.117
Nov 23 15:33:15 np0005532763 systemd[1]: Started Authorization Manager.
Nov 23 15:33:15 np0005532763 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 15:33:15 np0005532763 systemd[1]: Finished man-db-cache-update.service.
Nov 23 15:33:15 np0005532763 systemd[1]: man-db-cache-update.service: Consumed 7.057s CPU time.
Nov 23 15:33:15 np0005532763 systemd[1]: run-r51175f4db3f541fbac4f54d4b43232a6.service: Deactivated successfully.
Nov 23 15:33:16 np0005532763 python3.9[43508]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:33:16 np0005532763 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 23 15:33:16 np0005532763 systemd[1]: tuned.service: Deactivated successfully.
Nov 23 15:33:16 np0005532763 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 23 15:33:16 np0005532763 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 23 15:33:16 np0005532763 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 23 15:33:17 np0005532763 python3.9[43669]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 23 15:33:21 np0005532763 python3.9[43821]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:33:21 np0005532763 systemd[1]: Reloading.
Nov 23 15:33:21 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:33:22 np0005532763 python3.9[44010]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:33:23 np0005532763 systemd[1]: Reloading.
Nov 23 15:33:23 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:33:24 np0005532763 python3.9[44200]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:33:25 np0005532763 python3.9[44353]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:33:25 np0005532763 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 23 15:33:26 np0005532763 python3.9[44506]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:33:28 np0005532763 python3.9[44668]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:33:29 np0005532763 python3.9[44821]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:33:29 np0005532763 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 23 15:33:29 np0005532763 systemd[1]: Stopped Apply Kernel Variables.
Nov 23 15:33:29 np0005532763 systemd[1]: Stopping Apply Kernel Variables...
Nov 23 15:33:29 np0005532763 systemd[1]: Starting Apply Kernel Variables...
Nov 23 15:33:29 np0005532763 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 23 15:33:29 np0005532763 systemd[1]: Finished Apply Kernel Variables.
Nov 23 15:33:30 np0005532763 systemd[1]: session-10.scope: Deactivated successfully.
Nov 23 15:33:30 np0005532763 systemd[1]: session-10.scope: Consumed 2min 23.091s CPU time.
Nov 23 15:33:30 np0005532763 systemd-logind[830]: Session 10 logged out. Waiting for processes to exit.
Nov 23 15:33:30 np0005532763 systemd-logind[830]: Removed session 10.
Nov 23 15:33:35 np0005532763 systemd-logind[830]: New session 11 of user zuul.
Nov 23 15:33:35 np0005532763 systemd[1]: Started Session 11 of User zuul.
Nov 23 15:33:36 np0005532763 python3.9[45004]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:33:38 np0005532763 python3.9[45160]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 23 15:33:39 np0005532763 python3.9[45313]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 23 15:33:41 np0005532763 python3.9[45471]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 23 15:33:42 np0005532763 python3.9[45631]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:33:43 np0005532763 python3.9[45716]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 23 15:33:46 np0005532763 python3.9[45880]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:33:57 np0005532763 kernel: SELinux:  Converting 2728 SID table entries...
Nov 23 15:33:57 np0005532763 kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 15:33:57 np0005532763 kernel: SELinux:  policy capability open_perms=1
Nov 23 15:33:57 np0005532763 kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 15:33:57 np0005532763 kernel: SELinux:  policy capability always_check_network=0
Nov 23 15:33:57 np0005532763 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 15:33:57 np0005532763 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 15:33:57 np0005532763 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 15:33:58 np0005532763 dbus-broker-launch[812]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 23 15:33:58 np0005532763 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 23 15:33:59 np0005532763 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 15:33:59 np0005532763 systemd[1]: Starting man-db-cache-update.service...
Nov 23 15:34:00 np0005532763 systemd[1]: Reloading.
Nov 23 15:34:00 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:34:00 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:34:00 np0005532763 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 15:34:00 np0005532763 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 15:34:00 np0005532763 systemd[1]: Finished man-db-cache-update.service.
Nov 23 15:34:00 np0005532763 systemd[1]: man-db-cache-update.service: Consumed 1.008s CPU time.
Nov 23 15:34:00 np0005532763 systemd[1]: run-rf114c8ddd70842ec93e28f050e692a6f.service: Deactivated successfully.
Nov 23 15:34:03 np0005532763 python3.9[46978]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 15:34:04 np0005532763 systemd[1]: Reloading.
Nov 23 15:34:04 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:34:04 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:34:04 np0005532763 systemd[1]: Starting Open vSwitch Database Unit...
Nov 23 15:34:04 np0005532763 chown[47020]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 23 15:34:05 np0005532763 ovs-ctl[47025]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 23 15:34:05 np0005532763 ovs-ctl[47025]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 23 15:34:05 np0005532763 ovs-ctl[47025]: Starting ovsdb-server [  OK  ]
Nov 23 15:34:05 np0005532763 ovs-vsctl[47074]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 23 15:34:05 np0005532763 ovs-vsctl[47094]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"10e3bf57-dd2d-4b94-851f-925bcd297dde\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 23 15:34:05 np0005532763 ovs-ctl[47025]: Configuring Open vSwitch system IDs [  OK  ]
Nov 23 15:34:05 np0005532763 ovs-ctl[47025]: Enabling remote OVSDB managers [  OK  ]
Nov 23 15:34:05 np0005532763 ovs-vsctl[47100]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Nov 23 15:34:05 np0005532763 systemd[1]: Started Open vSwitch Database Unit.
Nov 23 15:34:05 np0005532763 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 23 15:34:05 np0005532763 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 23 15:34:05 np0005532763 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 23 15:34:05 np0005532763 kernel: openvswitch: Open vSwitch switching datapath
Nov 23 15:34:05 np0005532763 ovs-ctl[47144]: Inserting openvswitch module [  OK  ]
Nov 23 15:34:05 np0005532763 ovs-ctl[47113]: Starting ovs-vswitchd [  OK  ]
Nov 23 15:34:05 np0005532763 ovs-ctl[47113]: Enabling remote OVSDB managers [  OK  ]
Nov 23 15:34:05 np0005532763 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 23 15:34:05 np0005532763 ovs-vsctl[47161]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Nov 23 15:34:05 np0005532763 systemd[1]: Starting Open vSwitch...
Nov 23 15:34:05 np0005532763 systemd[1]: Finished Open vSwitch.
Nov 23 15:34:07 np0005532763 python3.9[47313]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:34:08 np0005532763 python3.9[47465]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 23 15:34:09 np0005532763 kernel: SELinux:  Converting 2742 SID table entries...
Nov 23 15:34:09 np0005532763 kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 15:34:09 np0005532763 kernel: SELinux:  policy capability open_perms=1
Nov 23 15:34:09 np0005532763 kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 15:34:09 np0005532763 kernel: SELinux:  policy capability always_check_network=0
Nov 23 15:34:09 np0005532763 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 15:34:09 np0005532763 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 15:34:09 np0005532763 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 15:34:11 np0005532763 python3.9[47620]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:34:12 np0005532763 dbus-broker-launch[812]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 23 15:34:12 np0005532763 python3.9[47778]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:34:14 np0005532763 python3.9[47931]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:34:16 np0005532763 python3.9[48218]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 15:34:17 np0005532763 python3.9[48368]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:34:18 np0005532763 python3.9[48522]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:34:19 np0005532763 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 15:34:19 np0005532763 systemd[1]: Starting man-db-cache-update.service...
Nov 23 15:34:19 np0005532763 systemd[1]: Reloading.
Nov 23 15:34:20 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:34:20 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:34:20 np0005532763 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 15:34:20 np0005532763 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 15:34:20 np0005532763 systemd[1]: Finished man-db-cache-update.service.
Nov 23 15:34:20 np0005532763 systemd[1]: run-reeaa2cd061ae4ec78262487edf4ed313.service: Deactivated successfully.
Nov 23 15:34:21 np0005532763 python3.9[48839]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:34:21 np0005532763 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 23 15:34:21 np0005532763 systemd[1]: Stopped Network Manager Wait Online.
Nov 23 15:34:21 np0005532763 systemd[1]: Stopping Network Manager Wait Online...
Nov 23 15:34:21 np0005532763 systemd[1]: Stopping Network Manager...
Nov 23 15:34:21 np0005532763 NetworkManager[7199]: <info>  [1763930061.4830] caught SIGTERM, shutting down normally.
Nov 23 15:34:21 np0005532763 NetworkManager[7199]: <info>  [1763930061.4859] dhcp4 (eth0): canceled DHCP transaction
Nov 23 15:34:21 np0005532763 NetworkManager[7199]: <info>  [1763930061.4859] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 23 15:34:21 np0005532763 NetworkManager[7199]: <info>  [1763930061.4860] dhcp4 (eth0): state changed no lease
Nov 23 15:34:21 np0005532763 NetworkManager[7199]: <info>  [1763930061.4868] manager: NetworkManager state is now CONNECTED_SITE
Nov 23 15:34:21 np0005532763 NetworkManager[7199]: <info>  [1763930061.4948] exiting (success)
Nov 23 15:34:21 np0005532763 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 23 15:34:21 np0005532763 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 23 15:34:21 np0005532763 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 23 15:34:21 np0005532763 systemd[1]: Stopped Network Manager.
Nov 23 15:34:21 np0005532763 systemd[1]: NetworkManager.service: Consumed 12.528s CPU time, 4.0M memory peak, read 0B from disk, written 34.5K to disk.
Nov 23 15:34:21 np0005532763 systemd[1]: Starting Network Manager...
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.5820] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:a896f7ce-22ce-43ea-a9ba-7e128facc2bb)
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.5821] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.5889] manager[0x564e15433090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 23 15:34:21 np0005532763 systemd[1]: Starting Hostname Service...
Nov 23 15:34:21 np0005532763 systemd[1]: Started Hostname Service.
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7264] hostname: hostname: using hostnamed
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7265] hostname: static hostname changed from (none) to "compute-2"
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7272] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7279] manager[0x564e15433090]: rfkill: Wi-Fi hardware radio set enabled
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7280] manager[0x564e15433090]: rfkill: WWAN hardware radio set enabled
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7317] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7331] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7332] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7334] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7335] manager: Networking is enabled by state file
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7338] settings: Loaded settings plugin: keyfile (internal)
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7344] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7386] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7401] dhcp: init: Using DHCP client 'internal'
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7408] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7420] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7431] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7448] device (lo): Activation: starting connection 'lo' (ac07c606-0f80-4608-b8dd-99a45b6a547d)
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7462] device (eth0): carrier: link connected
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7471] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7482] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7483] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7497] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7510] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7524] device (eth1): carrier: link connected
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7533] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7543] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (24d1ddd0-0657-595a-9a2d-45d9b7719a8e) (indicated)
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7544] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7556] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7570] device (eth1): Activation: starting connection 'ci-private-network' (24d1ddd0-0657-595a-9a2d-45d9b7719a8e)
Nov 23 15:34:21 np0005532763 systemd[1]: Started Network Manager.
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7580] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7593] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7597] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7599] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7603] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7607] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7610] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7624] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7628] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7661] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7668] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7682] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7708] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7725] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7729] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7740] device (lo): Activation: successful, device activated.
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7754] dhcp4 (eth0): state changed new lease, address=38.102.83.111
Nov 23 15:34:21 np0005532763 systemd[1]: Starting Network Manager Wait Online...
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7779] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7878] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7887] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7897] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7903] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7909] device (eth1): Activation: successful, device activated.
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7927] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7930] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7936] manager: NetworkManager state is now CONNECTED_SITE
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7941] device (eth0): Activation: successful, device activated.
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.7994] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 23 15:34:21 np0005532763 NetworkManager[48849]: <info>  [1763930061.8024] manager: startup complete
Nov 23 15:34:21 np0005532763 systemd[1]: Finished Network Manager Wait Online.
Nov 23 15:34:22 np0005532763 python3.9[49065]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:34:27 np0005532763 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 15:34:27 np0005532763 systemd[1]: Starting man-db-cache-update.service...
Nov 23 15:34:27 np0005532763 systemd[1]: Reloading.
Nov 23 15:34:27 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:34:27 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:34:27 np0005532763 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 15:34:28 np0005532763 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 15:34:28 np0005532763 systemd[1]: Finished man-db-cache-update.service.
Nov 23 15:34:28 np0005532763 systemd[1]: run-r1ffd82e5379a4f2ab7144f130e337003.service: Deactivated successfully.
Nov 23 15:34:29 np0005532763 python3.9[49525]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:34:30 np0005532763 python3.9[49677]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:34:31 np0005532763 python3.9[49831]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:34:31 np0005532763 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 23 15:34:32 np0005532763 python3.9[49983]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:34:33 np0005532763 python3.9[50135]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:34:34 np0005532763 python3.9[50287]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:34:35 np0005532763 python3.9[50439]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:34:35 np0005532763 python3.9[50562]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763930074.5963075-649-73420145848100/.source _original_basename=._h0yyf5u follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:34:36 np0005532763 python3.9[50714]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:34:37 np0005532763 python3.9[50866]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 23 15:34:38 np0005532763 python3.9[51018]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:34:41 np0005532763 python3.9[51445]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 23 15:34:42 np0005532763 ansible-async_wrapper.py[51620]: Invoked with j337567816913 300 /home/zuul/.ansible/tmp/ansible-tmp-1763930081.53431-847-44647860957987/AnsiballZ_edpm_os_net_config.py _
Nov 23 15:34:42 np0005532763 ansible-async_wrapper.py[51623]: Starting module and watcher
Nov 23 15:34:42 np0005532763 ansible-async_wrapper.py[51623]: Start watching 51624 (300)
Nov 23 15:34:42 np0005532763 ansible-async_wrapper.py[51624]: Start module (51624)
Nov 23 15:34:42 np0005532763 ansible-async_wrapper.py[51620]: Return async_wrapper task started.
Nov 23 15:34:42 np0005532763 python3.9[51625]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 23 15:34:43 np0005532763 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 23 15:34:43 np0005532763 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 23 15:34:43 np0005532763 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 23 15:34:43 np0005532763 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 23 15:34:43 np0005532763 kernel: cfg80211: failed to load regulatory.db
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.5933] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.5962] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.6944] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.6947] audit: op="connection-add" uuid="ec9c72a6-47ca-4273-ba30-be21a87ba147" name="br-ex-br" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.6975] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.6977] audit: op="connection-add" uuid="ddcdfedd-bea5-4dc3-9db7-83aee3fbc300" name="br-ex-port" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.6999] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7002] audit: op="connection-add" uuid="42ede959-d9f1-4f2e-bad4-dcd3c0b0ec87" name="eth1-port" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7024] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7027] audit: op="connection-add" uuid="6e6d3bb1-12ed-429c-b5eb-c04176558859" name="vlan20-port" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7047] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7051] audit: op="connection-add" uuid="eeb5c4df-b481-4f01-a326-48776bf56bb8" name="vlan21-port" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7071] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7075] audit: op="connection-add" uuid="23317a1c-6c21-47c3-b5a7-b9c6ee17920f" name="vlan22-port" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7097] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7100] audit: op="connection-add" uuid="699c7737-450e-48f1-a8b2-84278b0b7e03" name="vlan23-port" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7135] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,connection.timestamp,connection.autoconnect-priority,802-3-ethernet.mtu" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7167] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7170] audit: op="connection-add" uuid="33cbfb13-d89f-4065-97eb-17dff740784e" name="br-ex-if" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7244] audit: op="connection-update" uuid="24d1ddd0-0657-595a-9a2d-45d9b7719a8e" name="ci-private-network" args="ipv4.dns,ipv4.addresses,ipv4.never-default,ipv4.routes,ipv4.method,ipv4.routing-rules,ipv6.dns,ipv6.addr-gen-mode,ipv6.addresses,ipv6.routes,ipv6.method,ipv6.routing-rules,connection.controller,connection.port-type,connection.timestamp,connection.slave-type,connection.master,ovs-external-ids.data,ovs-interface.type" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7276] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7280] audit: op="connection-add" uuid="5e6f2e0a-c330-4ad4-9bcf-4802433ad2f8" name="vlan20-if" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7310] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7313] audit: op="connection-add" uuid="ea406c90-a8b3-476c-ad47-3a9d71b5c3d3" name="vlan21-if" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7344] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7349] audit: op="connection-add" uuid="9b9e3963-743f-4ee0-967b-2ac37a98148a" name="vlan22-if" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7379] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7382] audit: op="connection-add" uuid="0f6c5323-42cb-4f0f-8298-8c5b1886572d" name="vlan23-if" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7409] audit: op="connection-delete" uuid="19fc6877-689b-3c5c-bc86-f9bfe6b22958" name="Wired connection 1" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7431] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7447] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7454] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (ec9c72a6-47ca-4273-ba30-be21a87ba147)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7456] audit: op="connection-activate" uuid="ec9c72a6-47ca-4273-ba30-be21a87ba147" name="br-ex-br" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7459] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7472] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7478] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (ddcdfedd-bea5-4dc3-9db7-83aee3fbc300)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7481] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7490] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7496] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (42ede959-d9f1-4f2e-bad4-dcd3c0b0ec87)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7499] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7510] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7515] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (6e6d3bb1-12ed-429c-b5eb-c04176558859)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7518] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7529] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7535] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (eeb5c4df-b481-4f01-a326-48776bf56bb8)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7537] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7548] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7554] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (23317a1c-6c21-47c3-b5a7-b9c6ee17920f)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7557] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7570] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7576] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (699c7737-450e-48f1-a8b2-84278b0b7e03)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7577] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7579] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7580] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7587] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7591] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7595] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (33cbfb13-d89f-4065-97eb-17dff740784e)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7596] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7599] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7600] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7601] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7603] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7614] device (eth1): disconnecting for new activation request.
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7615] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7618] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7620] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7620] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7623] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7627] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7631] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (5e6f2e0a-c330-4ad4-9bcf-4802433ad2f8)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7631] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7634] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7636] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7637] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7639] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7644] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7648] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (ea406c90-a8b3-476c-ad47-3a9d71b5c3d3)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7648] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7651] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7653] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7654] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7657] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7661] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7665] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (9b9e3963-743f-4ee0-967b-2ac37a98148a)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7666] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7669] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7670] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7671] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7673] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7677] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7681] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (0f6c5323-42cb-4f0f-8298-8c5b1886572d)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7682] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7685] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7687] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7688] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7689] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7703] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.addr-gen-mode,ipv6.method,connection.autoconnect-priority,802-3-ethernet.mtu" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7705] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7708] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7709] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7716] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7720] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7725] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7728] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7729] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7735] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7739] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 kernel: ovs-system: entered promiscuous mode
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7742] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7744] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7749] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7753] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7756] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7757] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7764] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7768] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7771] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7773] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7778] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 kernel: Timeout policy base is empty
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7782] dhcp4 (eth0): canceled DHCP transaction
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7782] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7783] dhcp4 (eth0): state changed no lease
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7784] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 23 15:34:45 np0005532763 systemd-udevd[51632]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7796] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7799] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51626 uid=0 result="fail" reason="Device is not activated"
Nov 23 15:34:45 np0005532763 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7847] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7851] dhcp4 (eth0): state changed new lease, address=38.102.83.111
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7895] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7902] device (eth1): disconnecting for new activation request.
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7903] audit: op="connection-activate" uuid="24d1ddd0-0657-595a-9a2d-45d9b7719a8e" name="ci-private-network" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7904] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7912] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Nov 23 15:34:45 np0005532763 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7947] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51626 uid=0 result="success"
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.7956] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8080] device (eth1): Activation: starting connection 'ci-private-network' (24d1ddd0-0657-595a-9a2d-45d9b7719a8e)
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8095] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8098] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8104] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8106] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8107] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8108] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8109] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8110] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8111] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8114] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8124] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8129] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8134] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8139] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8143] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8147] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8151] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8155] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8159] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8164] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8168] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8172] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8176] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8181] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8187] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8191] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8241] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8243] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8249] device (eth1): Activation: successful, device activated.
Nov 23 15:34:45 np0005532763 kernel: br-ex: entered promiscuous mode
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8502] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8528] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 kernel: vlan22: entered promiscuous mode
Nov 23 15:34:45 np0005532763 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8678] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8684] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8692] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 23 15:34:45 np0005532763 systemd-udevd[51630]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 15:34:45 np0005532763 kernel: vlan20: entered promiscuous mode
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8766] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8790] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 kernel: vlan21: entered promiscuous mode
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8824] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8830] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 kernel: vlan23: entered promiscuous mode
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8843] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 23 15:34:45 np0005532763 systemd-udevd[51631]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8899] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8918] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8949] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8951] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.8962] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.9013] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.9016] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.9055] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.9065] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.9107] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.9110] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.9112] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.9123] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.9132] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 23 15:34:45 np0005532763 NetworkManager[48849]: <info>  [1763930085.9143] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 23 15:34:46 np0005532763 python3.9[51984]: ansible-ansible.legacy.async_status Invoked with jid=j337567816913.51620 mode=status _async_dir=/root/.ansible_async
Nov 23 15:34:47 np0005532763 NetworkManager[48849]: <info>  [1763930087.0529] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51626 uid=0 result="success"
Nov 23 15:34:47 np0005532763 NetworkManager[48849]: <info>  [1763930087.3114] checkpoint[0x564e15409950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 23 15:34:47 np0005532763 NetworkManager[48849]: <info>  [1763930087.3117] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51626 uid=0 result="success"
Nov 23 15:34:47 np0005532763 NetworkManager[48849]: <info>  [1763930087.6386] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51626 uid=0 result="success"
Nov 23 15:34:47 np0005532763 NetworkManager[48849]: <info>  [1763930087.6394] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51626 uid=0 result="success"
Nov 23 15:34:47 np0005532763 ansible-async_wrapper.py[51623]: 51624 still running (300)
Nov 23 15:34:47 np0005532763 NetworkManager[48849]: <info>  [1763930087.7959] audit: op="networking-control" arg="global-dns-configuration" pid=51626 uid=0 result="success"
Nov 23 15:34:47 np0005532763 NetworkManager[48849]: <info>  [1763930087.7987] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 23 15:34:47 np0005532763 NetworkManager[48849]: <info>  [1763930087.8014] audit: op="networking-control" arg="global-dns-configuration" pid=51626 uid=0 result="success"
Nov 23 15:34:47 np0005532763 NetworkManager[48849]: <info>  [1763930087.8546] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51626 uid=0 result="success"
Nov 23 15:34:47 np0005532763 NetworkManager[48849]: <info>  [1763930087.9694] checkpoint[0x564e15409a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 23 15:34:47 np0005532763 NetworkManager[48849]: <info>  [1763930087.9697] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51626 uid=0 result="success"
Nov 23 15:34:48 np0005532763 ansible-async_wrapper.py[51624]: Module complete (51624)
Nov 23 15:34:50 np0005532763 python3.9[52091]: ansible-ansible.legacy.async_status Invoked with jid=j337567816913.51620 mode=status _async_dir=/root/.ansible_async
Nov 23 15:34:51 np0005532763 python3.9[52190]: ansible-ansible.legacy.async_status Invoked with jid=j337567816913.51620 mode=cleanup _async_dir=/root/.ansible_async
Nov 23 15:34:51 np0005532763 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 23 15:34:52 np0005532763 python3.9[52344]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:34:52 np0005532763 python3.9[52467]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763930091.435422-928-169007224655270/.source.returncode _original_basename=.d64hi2yi follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:34:52 np0005532763 ansible-async_wrapper.py[51623]: Done in kid B.
Nov 23 15:34:53 np0005532763 python3.9[52620]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:34:54 np0005532763 python3.9[52743]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763930092.94871-976-172833433872167/.source.cfg _original_basename=.83zk_z71 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:34:55 np0005532763 python3.9[52895]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:34:55 np0005532763 systemd[1]: Reloading Network Manager...
Nov 23 15:34:55 np0005532763 NetworkManager[48849]: <info>  [1763930095.3196] audit: op="reload" arg="0" pid=52899 uid=0 result="success"
Nov 23 15:34:55 np0005532763 NetworkManager[48849]: <info>  [1763930095.3208] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 23 15:34:55 np0005532763 systemd[1]: Reloaded Network Manager.
Nov 23 15:34:55 np0005532763 systemd[1]: session-11.scope: Deactivated successfully.
Nov 23 15:34:55 np0005532763 systemd[1]: session-11.scope: Consumed 56.646s CPU time.
Nov 23 15:34:55 np0005532763 systemd-logind[830]: Session 11 logged out. Waiting for processes to exit.
Nov 23 15:34:55 np0005532763 systemd-logind[830]: Removed session 11.
Nov 23 15:35:01 np0005532763 systemd-logind[830]: New session 12 of user zuul.
Nov 23 15:35:01 np0005532763 systemd[1]: Started Session 12 of User zuul.
Nov 23 15:35:02 np0005532763 python3.9[53083]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:35:03 np0005532763 python3.9[53238]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:35:05 np0005532763 python3.9[53431]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:35:05 np0005532763 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 23 15:35:05 np0005532763 systemd[1]: session-12.scope: Deactivated successfully.
Nov 23 15:35:05 np0005532763 systemd[1]: session-12.scope: Consumed 2.978s CPU time.
Nov 23 15:35:05 np0005532763 systemd-logind[830]: Session 12 logged out. Waiting for processes to exit.
Nov 23 15:35:05 np0005532763 systemd-logind[830]: Removed session 12.
Nov 23 15:35:11 np0005532763 systemd-logind[830]: New session 13 of user zuul.
Nov 23 15:35:11 np0005532763 systemd[1]: Started Session 13 of User zuul.
Nov 23 15:35:12 np0005532763 python3.9[53613]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:35:13 np0005532763 python3.9[53768]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:35:14 np0005532763 python3.9[53924]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:35:15 np0005532763 python3.9[54008]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:35:17 np0005532763 python3.9[54162]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:35:19 np0005532763 python3.9[54357]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:35:20 np0005532763 python3.9[54509]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:35:20 np0005532763 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck3272753832-merged.mount: Deactivated successfully.
Nov 23 15:35:20 np0005532763 podman[54510]: 2025-11-23 20:35:20.132170644 +0000 UTC m=+0.070473370 system refresh
Nov 23 15:35:21 np0005532763 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 15:35:21 np0005532763 python3.9[54673]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:35:21 np0005532763 python3.9[54796]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930120.45633-199-208586745402965/.source.json follow=False _original_basename=podman_network_config.j2 checksum=814780fecdffe97f3a4567a9c16ace2c9e1d2235 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:35:22 np0005532763 python3.9[54948]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:35:23 np0005532763 python3.9[55071]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763930122.1012087-244-166670822982713/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:35:24 np0005532763 python3.9[55223]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:35:25 np0005532763 python3.9[55375]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:35:26 np0005532763 python3.9[55527]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:35:26 np0005532763 python3.9[55679]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:35:27 np0005532763 python3.9[55831]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:35:30 np0005532763 python3.9[55984]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:35:31 np0005532763 python3.9[56138]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:35:32 np0005532763 python3.9[56290]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:35:33 np0005532763 python3.9[56442]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:35:34 np0005532763 python3.9[56595]: ansible-service_facts Invoked
Nov 23 15:35:34 np0005532763 network[56612]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 15:35:34 np0005532763 network[56613]: 'network-scripts' will be removed from distribution in near future.
Nov 23 15:35:34 np0005532763 network[56614]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 15:35:40 np0005532763 python3.9[57066]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:35:43 np0005532763 python3.9[57219]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 23 15:35:44 np0005532763 python3.9[57371]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:35:45 np0005532763 python3.9[57496]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763930143.9347796-677-109976909468292/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:35:46 np0005532763 python3.9[57650]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:35:46 np0005532763 python3.9[57775]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763930145.5970263-722-33938836028297/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:35:48 np0005532763 python3.9[57929]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:35:50 np0005532763 python3.9[58083]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:35:51 np0005532763 python3.9[58167]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:35:54 np0005532763 python3.9[58321]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:35:55 np0005532763 python3.9[58405]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:35:55 np0005532763 chronyd[837]: chronyd exiting
Nov 23 15:35:55 np0005532763 systemd[1]: Stopping NTP client/server...
Nov 23 15:35:55 np0005532763 systemd[1]: chronyd.service: Deactivated successfully.
Nov 23 15:35:55 np0005532763 systemd[1]: Stopped NTP client/server.
Nov 23 15:35:55 np0005532763 systemd[1]: Starting NTP client/server...
Nov 23 15:35:55 np0005532763 chronyd[58413]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 23 15:35:55 np0005532763 chronyd[58413]: Frequency -28.228 +/- 0.822 ppm read from /var/lib/chrony/drift
Nov 23 15:35:55 np0005532763 chronyd[58413]: Loaded seccomp filter (level 2)
Nov 23 15:35:55 np0005532763 systemd[1]: Started NTP client/server.
Nov 23 15:35:55 np0005532763 systemd[1]: session-13.scope: Deactivated successfully.
Nov 23 15:35:55 np0005532763 systemd[1]: session-13.scope: Consumed 31.467s CPU time.
Nov 23 15:35:55 np0005532763 systemd-logind[830]: Session 13 logged out. Waiting for processes to exit.
Nov 23 15:35:55 np0005532763 systemd-logind[830]: Removed session 13.
Nov 23 15:36:01 np0005532763 systemd-logind[830]: New session 14 of user zuul.
Nov 23 15:36:01 np0005532763 systemd[1]: Started Session 14 of User zuul.
Nov 23 15:36:02 np0005532763 python3.9[58594]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:03 np0005532763 python3.9[58746]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:36:03 np0005532763 python3.9[58869]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763930162.2786086-64-56670219218538/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:04 np0005532763 systemd[1]: session-14.scope: Deactivated successfully.
Nov 23 15:36:04 np0005532763 systemd[1]: session-14.scope: Consumed 2.148s CPU time.
Nov 23 15:36:04 np0005532763 systemd-logind[830]: Session 14 logged out. Waiting for processes to exit.
Nov 23 15:36:04 np0005532763 systemd-logind[830]: Removed session 14.
Nov 23 15:36:09 np0005532763 systemd-logind[830]: New session 15 of user zuul.
Nov 23 15:36:09 np0005532763 systemd[1]: Started Session 15 of User zuul.
Nov 23 15:36:10 np0005532763 python3.9[59047]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:36:11 np0005532763 python3.9[59203]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:12 np0005532763 python3.9[59378]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:36:13 np0005532763 python3.9[59501]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1763930171.9963586-85-156151427115209/.source.json _original_basename=.fg8r4lfn follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:14 np0005532763 python3.9[59653]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:36:15 np0005532763 python3.9[59776]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763930174.065605-154-213601471826113/.source _original_basename=.jbz21rnh follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:16 np0005532763 python3.9[59928]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:36:16 np0005532763 python3.9[60080]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:36:17 np0005532763 python3.9[60203]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763930176.4181268-227-146245849854087/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:36:18 np0005532763 python3.9[60355]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:36:19 np0005532763 python3.9[60478]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763930177.8390522-227-137841074275523/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:36:19 np0005532763 python3.9[60630]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:20 np0005532763 python3.9[60782]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:36:21 np0005532763 python3.9[60905]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930180.133818-337-52012391095542/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:22 np0005532763 python3.9[61057]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:36:23 np0005532763 python3.9[61180]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930181.6958432-382-7626742189537/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:24 np0005532763 python3.9[61332]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:36:24 np0005532763 systemd[1]: Reloading.
Nov 23 15:36:24 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:36:24 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:36:24 np0005532763 systemd[1]: Reloading.
Nov 23 15:36:24 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:36:24 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:36:24 np0005532763 systemd[1]: Starting EDPM Container Shutdown...
Nov 23 15:36:24 np0005532763 systemd[1]: Finished EDPM Container Shutdown.
Nov 23 15:36:25 np0005532763 python3.9[61560]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:36:26 np0005532763 python3.9[61683]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930185.0109642-451-69211691420228/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:27 np0005532763 python3.9[61835]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:36:27 np0005532763 python3.9[61958]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930186.6399543-497-48738002250809/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:28 np0005532763 python3.9[62110]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:36:28 np0005532763 systemd[1]: Reloading.
Nov 23 15:36:28 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:36:28 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:36:29 np0005532763 systemd[1]: Reloading.
Nov 23 15:36:29 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:36:29 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:36:29 np0005532763 systemd[1]: Starting Create netns directory...
Nov 23 15:36:29 np0005532763 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 15:36:29 np0005532763 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 15:36:29 np0005532763 systemd[1]: Finished Create netns directory.
Nov 23 15:36:30 np0005532763 python3.9[62337]: ansible-ansible.builtin.service_facts Invoked
Nov 23 15:36:30 np0005532763 network[62354]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 15:36:30 np0005532763 network[62355]: 'network-scripts' will be removed from distribution in near future.
Nov 23 15:36:30 np0005532763 network[62356]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 15:36:36 np0005532763 python3.9[62618]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:36:36 np0005532763 systemd[1]: Reloading.
Nov 23 15:36:36 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:36:36 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:36:36 np0005532763 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 23 15:36:36 np0005532763 iptables.init[62658]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 23 15:36:36 np0005532763 iptables.init[62658]: iptables: Flushing firewall rules: [  OK  ]
Nov 23 15:36:36 np0005532763 systemd[1]: iptables.service: Deactivated successfully.
Nov 23 15:36:36 np0005532763 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 23 15:36:37 np0005532763 python3.9[62855]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:36:39 np0005532763 python3.9[63009]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:36:39 np0005532763 systemd[1]: Reloading.
Nov 23 15:36:39 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:36:39 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:36:39 np0005532763 systemd[1]: Starting Netfilter Tables...
Nov 23 15:36:39 np0005532763 systemd[1]: Finished Netfilter Tables.
Nov 23 15:36:40 np0005532763 python3.9[63201]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:36:41 np0005532763 python3.9[63354]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:36:42 np0005532763 python3.9[63479]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763930201.1731148-703-58433126003004/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:43 np0005532763 python3.9[63632]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:36:43 np0005532763 systemd[1]: Reloading OpenSSH server daemon...
Nov 23 15:36:43 np0005532763 systemd[1]: Reloaded OpenSSH server daemon.
Nov 23 15:36:44 np0005532763 python3.9[63788]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:45 np0005532763 python3.9[63940]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:36:45 np0005532763 python3.9[64063]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930204.5222473-797-95084168490863/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:46 np0005532763 python3.9[64215]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 23 15:36:47 np0005532763 systemd[1]: Starting Time & Date Service...
Nov 23 15:36:47 np0005532763 systemd[1]: Started Time & Date Service.
Nov 23 15:36:48 np0005532763 python3.9[64371]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:48 np0005532763 python3.9[64523]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:36:49 np0005532763 python3.9[64646]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763930208.370374-903-178365062341812/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:50 np0005532763 python3.9[64798]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:36:51 np0005532763 python3.9[64921]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763930209.9048035-947-63226587702401/.source.yaml _original_basename=.qf8lqpqs follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:52 np0005532763 python3.9[65073]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:36:52 np0005532763 python3.9[65196]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930211.4382887-992-54775046126365/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:53 np0005532763 python3.9[65348]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:36:54 np0005532763 python3.9[65501]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:36:55 np0005532763 python3[65654]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 23 15:36:56 np0005532763 python3.9[65806]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:36:56 np0005532763 python3.9[65929]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930215.680516-1108-181128945183895/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:57 np0005532763 python3.9[66081]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:36:58 np0005532763 python3.9[66204]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930217.1575627-1153-209130141404466/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:36:59 np0005532763 python3.9[66356]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:37:00 np0005532763 python3.9[66479]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930218.7707663-1199-86510912618602/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:37:01 np0005532763 python3.9[66631]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:37:01 np0005532763 python3.9[66754]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930220.408994-1244-278893111561156/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:37:02 np0005532763 python3.9[66906]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:37:03 np0005532763 python3.9[67029]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930222.0541704-1289-108168286058067/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:37:04 np0005532763 python3.9[67181]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:37:05 np0005532763 python3.9[67333]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:37:06 np0005532763 python3.9[67492]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:37:07 np0005532763 python3.9[67645]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:37:08 np0005532763 python3.9[67797]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:37:09 np0005532763 python3.9[67949]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 23 15:37:10 np0005532763 python3.9[68102]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 23 15:37:10 np0005532763 systemd[1]: session-15.scope: Deactivated successfully.
Nov 23 15:37:10 np0005532763 systemd[1]: session-15.scope: Consumed 46.465s CPU time.
Nov 23 15:37:10 np0005532763 systemd-logind[830]: Session 15 logged out. Waiting for processes to exit.
Nov 23 15:37:10 np0005532763 systemd-logind[830]: Removed session 15.
Nov 23 15:37:15 np0005532763 systemd-logind[830]: New session 16 of user zuul.
Nov 23 15:37:15 np0005532763 systemd[1]: Started Session 16 of User zuul.
Nov 23 15:37:16 np0005532763 python3.9[68283]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 23 15:37:17 np0005532763 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 23 15:37:17 np0005532763 python3.9[68437]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:37:18 np0005532763 python3.9[68589]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:37:20 np0005532763 python3.9[68741]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCZyfELJX7KkP8E4Yo+r9guKNy64TSJDfB+rBUAclCyKwGxjxhBTRAJJCOL6kSBIkbUub9LTNVh+s271jrKlK1rYs22c1DFe3ci9hBERauX4lIaBHw9kJBHURb9cB+VbonXf0hAdqGDLTXdqFnbed2oU0ngSuVesO/C9+SCSZFsfERuUe3/SXKbWfjehgYTi4GquXo6Ynq1HopME6mRR8qGsv6sgdkxpSaUiwtSBG5ONOSyzrev1t2hdDsRxvbZAZgV2ab6IMD9DTKaIXphHpumL6txas+nKViUfm+gW6p6EKNdHb/VLha7ghY3p4LE3OdXM4eytxszF0Fzs/0CXzafNxHjVjHzqxrJBi/PT22i6QD60NTimabHulw8IkZG6KsuNVq1rmlSSGQGjqAs7l6hNH8kF4uq1JwOl6mVgct5iE+ZzhfO5WRWShiE1LlCZpqdYE9VqmBrK5r70N0srW3h2mb4lTAwvC089Vert64D29M7riepyGCrGInpE4aK7Sk=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIFop+sR8mOkxOfCCMKg8Voa+6Ns0zHMRLKg+WdnL56v#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFQ0Rj0/OjRh0AQLkOX0VueFFf3xD5FqSzewSN/8R0Xh0Ybf7bkNUGszKaTkKSUBKR2e9V/GwA+BxEChWtzU3sY=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCrfRiqah4FSYlin2mt3PYchMDfWNjxPXqcCCW7iymA93OXZ1reX9dxsJRSssuxIkwaYv7OC+wrUmMOsDhULhy9uNDku8TnHodZVNms8z3UwQW2GPePqEdQ56rKSJ5DhpY0ly7PapOQ69jitmBGQjsu8go19hV3djXlFm1du9V1HMnfGqyr5REZ5ACjW2Rr0108gdYgrt/xh+1sl7cgixK0vUKaqN47/VJHXSTk20aXknt5lhurSKMbRD4cgP1pz0lBJ8LfEvFajLlXBk7MtsI8L94qtHH20hWUk8P2FmqsM4LoLIY4YkAT6kzDPkNdC5F3bpl67NzNXKLdStChVsjRVgrsR0JhU4YO8nYPSqn85KWQUMsuQhXfeMPb5a0n4vSmF0hQhaTctIIK5Yq+qK3S5Ee0tV+ZLMcrYiRfVJYjULh+8LazeUYBtZAVkOoenlHNpcxfVl2v8Fx37PYu6wY/1Ol7i+Fyg+DMculPNu0E00hYIfuSPW06sm98V0zJ7bs=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIC0+oolG6Djq6MTp/HXh3SEc2a8aDRu5q8AnCiNHx/fN#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBC1GCZqvti/wHDh2Oo7NSAFToY/dykBAXL2bgJmg9kqKO2qTzfIYtCRiGP/x9yaw+D3ymaftMgdHgFkzRtYcXz0=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCo3+sqhh74Wal6wWv19BRNHNnjTPYKculYCUftHSfYmbg5LryLTnsWAJdalXVBYQIJtq5uFrJRBG4C0R1XMU/MT4ZxuTtafwAzeTnKoCHbN/+mH31bndpvGKYRQ9AQHmamquyDQaSEjIYKFaK6eM7uVV/PaSZqasrB6awv3MeDH/GhtlyJwY7ble8M3UtG9jMWuPq/qX+TnKCZI3COyKBCe7F3aeaIewsho+T7qsRd8UNr55SHWJ1N6xYtA4FUayJ4cCZUeo4+SOJuQWb6A3HZm75y0LpdLDFH54DqyDqKVvDUfaKJJQV++3GT9kF9+jrwJDEK9VslSlEylLZ0zg1J0Z2zyMOwOAxBKEUXQNymC+00ybwJd4trP7KDy6+ZGOtHEThBgVO6vtuxQLWhseNa3otNXh7cHTf+Jfo7uo1wHbasd6aD1AVxvt4yKgOGy1ypt9Ps/COlbfHHFYZsI5gVLyJyK8aeipUjJUe6u6Qlf/F/inV1rwRBg8li7oeW7Ss=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFE96kcIFDgsK09K4ZL9HihPRGUmf4YDgXlXqtYy0M8r#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJoWf98fFp9mmY0S22K7n+FjL7cDYCGLm8eglORId7ZBFp9PG5e8P+ws6VWjBbceNazmskqBYurrlrsvB4Mu40E=#012 create=True mode=0644 path=/tmp/ansible.mirurtgx state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:37:20 np0005532763 python3.9[68893]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.mirurtgx' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:37:21 np0005532763 python3.9[69047]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.mirurtgx state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:37:22 np0005532763 systemd[1]: session-16.scope: Deactivated successfully.
Nov 23 15:37:22 np0005532763 systemd[1]: session-16.scope: Consumed 4.597s CPU time.
Nov 23 15:37:22 np0005532763 systemd-logind[830]: Session 16 logged out. Waiting for processes to exit.
Nov 23 15:37:22 np0005532763 systemd-logind[830]: Removed session 16.
Nov 23 15:37:27 np0005532763 systemd-logind[830]: New session 17 of user zuul.
Nov 23 15:37:27 np0005532763 systemd[1]: Started Session 17 of User zuul.
Nov 23 15:37:29 np0005532763 python3.9[69225]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:37:30 np0005532763 python3.9[69381]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 23 15:37:31 np0005532763 python3.9[69535]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:37:32 np0005532763 python3.9[69688]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:37:33 np0005532763 python3.9[69841]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:37:34 np0005532763 python3.9[69995]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:37:35 np0005532763 python3.9[70150]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:37:35 np0005532763 systemd[1]: session-17.scope: Deactivated successfully.
Nov 23 15:37:35 np0005532763 systemd[1]: session-17.scope: Consumed 5.601s CPU time.
Nov 23 15:37:35 np0005532763 systemd-logind[830]: Session 17 logged out. Waiting for processes to exit.
Nov 23 15:37:35 np0005532763 systemd-logind[830]: Removed session 17.
Nov 23 15:37:41 np0005532763 systemd-logind[830]: New session 18 of user zuul.
Nov 23 15:37:41 np0005532763 systemd[1]: Started Session 18 of User zuul.
Nov 23 15:37:42 np0005532763 python3.9[70329]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:37:43 np0005532763 python3.9[70485]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:37:44 np0005532763 python3.9[70569]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 23 15:37:46 np0005532763 python3.9[70720]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:37:47 np0005532763 python3.9[70871]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 23 15:37:48 np0005532763 python3.9[71021]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:37:48 np0005532763 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 15:37:49 np0005532763 python3.9[71172]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:37:49 np0005532763 systemd[1]: session-18.scope: Deactivated successfully.
Nov 23 15:37:49 np0005532763 systemd[1]: session-18.scope: Consumed 6.751s CPU time.
Nov 23 15:37:49 np0005532763 systemd-logind[830]: Session 18 logged out. Waiting for processes to exit.
Nov 23 15:37:49 np0005532763 systemd-logind[830]: Removed session 18.
Nov 23 15:37:58 np0005532763 systemd-logind[830]: New session 19 of user zuul.
Nov 23 15:37:58 np0005532763 systemd[1]: Started Session 19 of User zuul.
Nov 23 15:38:04 np0005532763 python3[71938]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:38:05 np0005532763 chronyd[58413]: Selected source 23.159.16.194 (pool.ntp.org)
Nov 23 15:38:06 np0005532763 python3[72033]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 23 15:38:08 np0005532763 python3[72060]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 23 15:38:08 np0005532763 python3[72086]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:38:08 np0005532763 kernel: loop: module loaded
Nov 23 15:38:08 np0005532763 kernel: loop3: detected capacity change from 0 to 41943040
Nov 23 15:38:09 np0005532763 python3[72122]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:38:09 np0005532763 lvm[72125]: PV /dev/loop3 not used.
Nov 23 15:38:09 np0005532763 lvm[72127]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 15:38:09 np0005532763 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Nov 23 15:38:09 np0005532763 lvm[72134]:  1 logical volume(s) in volume group "ceph_vg0" now active
Nov 23 15:38:09 np0005532763 lvm[72137]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 15:38:09 np0005532763 lvm[72137]: VG ceph_vg0 finished
Nov 23 15:38:09 np0005532763 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Nov 23 15:38:10 np0005532763 python3[72215]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 23 15:38:10 np0005532763 python3[72288]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763930289.9210145-36962-181410397071846/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:38:11 np0005532763 python3[72338]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:38:11 np0005532763 systemd[1]: Reloading.
Nov 23 15:38:11 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:38:11 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:38:11 np0005532763 systemd[1]: Starting Ceph OSD losetup...
Nov 23 15:38:11 np0005532763 bash[72378]: /dev/loop3: [64513]:4194936 (/var/lib/ceph-osd-0.img)
Nov 23 15:38:11 np0005532763 systemd[1]: Finished Ceph OSD losetup.
Nov 23 15:38:12 np0005532763 lvm[72380]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 15:38:12 np0005532763 lvm[72380]: VG ceph_vg0 finished
Nov 23 15:38:14 np0005532763 python3[72404]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:39:48 np0005532763 systemd-logind[830]: New session 20 of user ceph-admin.
Nov 23 15:39:48 np0005532763 systemd[1]: Created slice User Slice of UID 42477.
Nov 23 15:39:48 np0005532763 systemd[1]: Starting User Runtime Directory /run/user/42477...
Nov 23 15:39:48 np0005532763 systemd[1]: Finished User Runtime Directory /run/user/42477.
Nov 23 15:39:48 np0005532763 systemd[1]: Starting User Manager for UID 42477...
Nov 23 15:39:48 np0005532763 systemd[72454]: Queued start job for default target Main User Target.
Nov 23 15:39:48 np0005532763 systemd-logind[830]: New session 22 of user ceph-admin.
Nov 23 15:39:48 np0005532763 systemd[72454]: Created slice User Application Slice.
Nov 23 15:39:48 np0005532763 systemd[72454]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 23 15:39:48 np0005532763 systemd[72454]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 15:39:48 np0005532763 systemd[72454]: Reached target Paths.
Nov 23 15:39:48 np0005532763 systemd[72454]: Reached target Timers.
Nov 23 15:39:48 np0005532763 systemd[72454]: Starting D-Bus User Message Bus Socket...
Nov 23 15:39:48 np0005532763 systemd[72454]: Starting Create User's Volatile Files and Directories...
Nov 23 15:39:48 np0005532763 systemd[72454]: Listening on D-Bus User Message Bus Socket.
Nov 23 15:39:48 np0005532763 systemd[72454]: Reached target Sockets.
Nov 23 15:39:48 np0005532763 systemd[72454]: Finished Create User's Volatile Files and Directories.
Nov 23 15:39:48 np0005532763 systemd[72454]: Reached target Basic System.
Nov 23 15:39:48 np0005532763 systemd[72454]: Reached target Main User Target.
Nov 23 15:39:48 np0005532763 systemd[72454]: Startup finished in 139ms.
Nov 23 15:39:48 np0005532763 systemd[1]: Started User Manager for UID 42477.
Nov 23 15:39:48 np0005532763 systemd[1]: Started Session 20 of User ceph-admin.
Nov 23 15:39:48 np0005532763 systemd[1]: Started Session 22 of User ceph-admin.
Nov 23 15:39:48 np0005532763 systemd-logind[830]: New session 23 of user ceph-admin.
Nov 23 15:39:48 np0005532763 systemd[1]: Started Session 23 of User ceph-admin.
Nov 23 15:39:49 np0005532763 systemd-logind[830]: New session 24 of user ceph-admin.
Nov 23 15:39:49 np0005532763 systemd[1]: Started Session 24 of User ceph-admin.
Nov 23 15:39:49 np0005532763 systemd-logind[830]: New session 25 of user ceph-admin.
Nov 23 15:39:49 np0005532763 systemd[1]: Started Session 25 of User ceph-admin.
Nov 23 15:39:49 np0005532763 systemd-logind[830]: New session 26 of user ceph-admin.
Nov 23 15:39:49 np0005532763 systemd[1]: Started Session 26 of User ceph-admin.
Nov 23 15:39:50 np0005532763 systemd-logind[830]: New session 27 of user ceph-admin.
Nov 23 15:39:50 np0005532763 systemd[1]: Started Session 27 of User ceph-admin.
Nov 23 15:39:50 np0005532763 systemd-logind[830]: New session 28 of user ceph-admin.
Nov 23 15:39:50 np0005532763 systemd[1]: Started Session 28 of User ceph-admin.
Nov 23 15:39:51 np0005532763 systemd-logind[830]: New session 29 of user ceph-admin.
Nov 23 15:39:51 np0005532763 systemd[1]: Started Session 29 of User ceph-admin.
Nov 23 15:39:51 np0005532763 systemd-logind[830]: New session 30 of user ceph-admin.
Nov 23 15:39:51 np0005532763 systemd[1]: Started Session 30 of User ceph-admin.
Nov 23 15:39:52 np0005532763 systemd-logind[830]: New session 31 of user ceph-admin.
Nov 23 15:39:52 np0005532763 systemd[1]: Started Session 31 of User ceph-admin.
Nov 23 15:39:52 np0005532763 systemd-logind[830]: New session 32 of user ceph-admin.
Nov 23 15:39:52 np0005532763 systemd[1]: Started Session 32 of User ceph-admin.
Nov 23 15:39:53 np0005532763 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 15:40:35 np0005532763 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 15:40:36 np0005532763 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 15:40:36 np0005532763 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 15:40:36 np0005532763 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 15:40:36 np0005532763 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73053 (sysctl)
Nov 23 15:40:36 np0005532763 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 23 15:40:36 np0005532763 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 23 15:40:37 np0005532763 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 15:40:38 np0005532763 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 15:40:40 np0005532763 systemd[1]: var-lib-containers-storage-overlay-compat782866743-merged.mount: Deactivated successfully.
Nov 23 15:40:40 np0005532763 systemd[1]: var-lib-containers-storage-overlay-compat782866743-lower\x2dmapped.mount: Deactivated successfully.
Nov 23 15:40:55 np0005532763 podman[73234]: 2025-11-23 20:40:55.970618936 +0000 UTC m=+17.542573417 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:40:55 np0005532763 podman[73234]: 2025-11-23 20:40:55.989973654 +0000 UTC m=+17.561928105 container create 413ca88e0f306f1d35ac6ceaa7c9b2ac7ed1e1add8e434fbece0a321cf189571 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_wing, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2)
Nov 23 15:40:56 np0005532763 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 23 15:40:56 np0005532763 systemd[1]: Started libpod-conmon-413ca88e0f306f1d35ac6ceaa7c9b2ac7ed1e1add8e434fbece0a321cf189571.scope.
Nov 23 15:40:56 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:40:56 np0005532763 podman[73234]: 2025-11-23 20:40:56.133601976 +0000 UTC m=+17.705556487 container init 413ca88e0f306f1d35ac6ceaa7c9b2ac7ed1e1add8e434fbece0a321cf189571 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_wing, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Nov 23 15:40:56 np0005532763 podman[73234]: 2025-11-23 20:40:56.144639641 +0000 UTC m=+17.716594122 container start 413ca88e0f306f1d35ac6ceaa7c9b2ac7ed1e1add8e434fbece0a321cf189571 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_wing, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True)
Nov 23 15:40:56 np0005532763 podman[73234]: 2025-11-23 20:40:56.148899591 +0000 UTC m=+17.720854082 container attach 413ca88e0f306f1d35ac6ceaa7c9b2ac7ed1e1add8e434fbece0a321cf189571 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_wing, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 15:40:56 np0005532763 stupefied_wing[73294]: 167 167
Nov 23 15:40:56 np0005532763 systemd[1]: libpod-413ca88e0f306f1d35ac6ceaa7c9b2ac7ed1e1add8e434fbece0a321cf189571.scope: Deactivated successfully.
Nov 23 15:40:56 np0005532763 podman[73234]: 2025-11-23 20:40:56.155953697 +0000 UTC m=+17.727908178 container died 413ca88e0f306f1d35ac6ceaa7c9b2ac7ed1e1add8e434fbece0a321cf189571 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_wing, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 15:40:56 np0005532763 systemd[1]: var-lib-containers-storage-overlay-365570739a11b88f39d24eba9a07373a83ab3670fffb71efbbcd52185f98b106-merged.mount: Deactivated successfully.
Nov 23 15:40:56 np0005532763 podman[73234]: 2025-11-23 20:40:56.206147005 +0000 UTC m=+17.778101496 container remove 413ca88e0f306f1d35ac6ceaa7c9b2ac7ed1e1add8e434fbece0a321cf189571 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=stupefied_wing, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Nov 23 15:40:56 np0005532763 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 15:40:56 np0005532763 systemd[1]: libpod-conmon-413ca88e0f306f1d35ac6ceaa7c9b2ac7ed1e1add8e434fbece0a321cf189571.scope: Deactivated successfully.
Nov 23 15:40:56 np0005532763 podman[73316]: 2025-11-23 20:40:56.47933273 +0000 UTC m=+0.083316675 container create ad9ea835bedb49440ca7b9fe3b5319a56b51cdbe59cd01cb42c5a5b7fc48680b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_khayyam, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 15:40:56 np0005532763 podman[73316]: 2025-11-23 20:40:56.445243188 +0000 UTC m=+0.049227203 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:40:56 np0005532763 systemd[1]: Started libpod-conmon-ad9ea835bedb49440ca7b9fe3b5319a56b51cdbe59cd01cb42c5a5b7fc48680b.scope.
Nov 23 15:40:56 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:40:56 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed7fd309cffd4927042bb093fdfb1f718a9529228c8ba8d2574f286baf478552/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 15:40:56 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed7fd309cffd4927042bb093fdfb1f718a9529228c8ba8d2574f286baf478552/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 15:40:56 np0005532763 podman[73316]: 2025-11-23 20:40:56.595719507 +0000 UTC m=+0.199703462 container init ad9ea835bedb49440ca7b9fe3b5319a56b51cdbe59cd01cb42c5a5b7fc48680b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_khayyam, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Nov 23 15:40:56 np0005532763 podman[73316]: 2025-11-23 20:40:56.608248569 +0000 UTC m=+0.212232514 container start ad9ea835bedb49440ca7b9fe3b5319a56b51cdbe59cd01cb42c5a5b7fc48680b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_khayyam, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:40:56 np0005532763 podman[73316]: 2025-11-23 20:40:56.613676003 +0000 UTC m=+0.217660028 container attach ad9ea835bedb49440ca7b9fe3b5319a56b51cdbe59cd01cb42c5a5b7fc48680b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_khayyam, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]: [
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:    {
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:        "available": false,
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:        "being_replaced": false,
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:        "ceph_device_lvm": false,
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:        "device_id": "QEMU_DVD-ROM_QM00001",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:        "lsm_data": {},
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:        "lvs": [],
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:        "path": "/dev/sr0",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:        "rejected_reasons": [
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "Insufficient space (<5GB)",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "Has a FileSystem"
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:        ],
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:        "sys_api": {
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "actuators": null,
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "device_nodes": [
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:                "sr0"
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            ],
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "devname": "sr0",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "human_readable_size": "482.00 KB",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "id_bus": "ata",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "model": "QEMU DVD-ROM",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "nr_requests": "2",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "parent": "/dev/sr0",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "partitions": {},
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "path": "/dev/sr0",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "removable": "1",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "rev": "2.5+",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "ro": "0",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "rotational": "1",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "sas_address": "",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "sas_device_handle": "",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "scheduler_mode": "mq-deadline",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "sectors": 0,
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "sectorsize": "2048",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "size": 493568.0,
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "support_discard": "2048",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "type": "disk",
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:            "vendor": "QEMU"
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:        }
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]:    }
Nov 23 15:40:57 np0005532763 sad_khayyam[73332]: ]
Nov 23 15:40:57 np0005532763 systemd[1]: libpod-ad9ea835bedb49440ca7b9fe3b5319a56b51cdbe59cd01cb42c5a5b7fc48680b.scope: Deactivated successfully.
Nov 23 15:40:57 np0005532763 systemd[1]: libpod-ad9ea835bedb49440ca7b9fe3b5319a56b51cdbe59cd01cb42c5a5b7fc48680b.scope: Consumed 1.021s CPU time.
Nov 23 15:40:57 np0005532763 conmon[73332]: conmon ad9ea835bedb49440ca7 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ad9ea835bedb49440ca7b9fe3b5319a56b51cdbe59cd01cb42c5a5b7fc48680b.scope/container/memory.events
Nov 23 15:40:57 np0005532763 podman[73316]: 2025-11-23 20:40:57.61964202 +0000 UTC m=+1.223625975 container died ad9ea835bedb49440ca7b9fe3b5319a56b51cdbe59cd01cb42c5a5b7fc48680b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_khayyam, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Nov 23 15:40:57 np0005532763 systemd[1]: var-lib-containers-storage-overlay-ed7fd309cffd4927042bb093fdfb1f718a9529228c8ba8d2574f286baf478552-merged.mount: Deactivated successfully.
Nov 23 15:40:57 np0005532763 podman[73316]: 2025-11-23 20:40:57.674186871 +0000 UTC m=+1.278170786 container remove ad9ea835bedb49440ca7b9fe3b5319a56b51cdbe59cd01cb42c5a5b7fc48680b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sad_khayyam, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Nov 23 15:40:57 np0005532763 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 15:40:57 np0005532763 systemd[1]: libpod-conmon-ad9ea835bedb49440ca7b9fe3b5319a56b51cdbe59cd01cb42c5a5b7fc48680b.scope: Deactivated successfully.
Nov 23 15:41:01 np0005532763 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 15:41:01 np0005532763 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 15:41:01 np0005532763 podman[75407]: 2025-11-23 20:41:01.811394891 +0000 UTC m=+0.067593264 container create 23b86c23447a8a350ce1a33adc0131a3f2c2a554a69c58cbfcfb9e68a3865acb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_visvesvaraya, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 23 15:41:01 np0005532763 systemd[1]: Started libpod-conmon-23b86c23447a8a350ce1a33adc0131a3f2c2a554a69c58cbfcfb9e68a3865acb.scope.
Nov 23 15:41:01 np0005532763 podman[75407]: 2025-11-23 20:41:01.781966314 +0000 UTC m=+0.038164727 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:01 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:41:01 np0005532763 podman[75407]: 2025-11-23 20:41:01.908807774 +0000 UTC m=+0.165006198 container init 23b86c23447a8a350ce1a33adc0131a3f2c2a554a69c58cbfcfb9e68a3865acb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_visvesvaraya, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 15:41:01 np0005532763 podman[75407]: 2025-11-23 20:41:01.920226264 +0000 UTC m=+0.176424627 container start 23b86c23447a8a350ce1a33adc0131a3f2c2a554a69c58cbfcfb9e68a3865acb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_visvesvaraya, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2)
Nov 23 15:41:01 np0005532763 podman[75407]: 2025-11-23 20:41:01.924862508 +0000 UTC m=+0.181060931 container attach 23b86c23447a8a350ce1a33adc0131a3f2c2a554a69c58cbfcfb9e68a3865acb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_visvesvaraya, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:41:01 np0005532763 objective_visvesvaraya[75423]: 167 167
Nov 23 15:41:01 np0005532763 systemd[1]: libpod-23b86c23447a8a350ce1a33adc0131a3f2c2a554a69c58cbfcfb9e68a3865acb.scope: Deactivated successfully.
Nov 23 15:41:01 np0005532763 conmon[75423]: conmon 23b86c23447a8a350ce1 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-23b86c23447a8a350ce1a33adc0131a3f2c2a554a69c58cbfcfb9e68a3865acb.scope/container/memory.events
Nov 23 15:41:01 np0005532763 podman[75407]: 2025-11-23 20:41:01.92915641 +0000 UTC m=+0.185354773 container died 23b86c23447a8a350ce1a33adc0131a3f2c2a554a69c58cbfcfb9e68a3865acb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_visvesvaraya, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Nov 23 15:41:01 np0005532763 podman[75407]: 2025-11-23 20:41:01.982227307 +0000 UTC m=+0.238425670 container remove 23b86c23447a8a350ce1a33adc0131a3f2c2a554a69c58cbfcfb9e68a3865acb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_visvesvaraya, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 15:41:01 np0005532763 systemd[1]: libpod-conmon-23b86c23447a8a350ce1a33adc0131a3f2c2a554a69c58cbfcfb9e68a3865acb.scope: Deactivated successfully.
Nov 23 15:41:02 np0005532763 podman[75440]: 2025-11-23 20:41:02.081082378 +0000 UTC m=+0.065081478 container create 02fdb3055d2929c67f6c99a1bbe3d7b49d715f3a5054ad728da7f1237acb8597 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_greider, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:41:02 np0005532763 systemd[1]: Started libpod-conmon-02fdb3055d2929c67f6c99a1bbe3d7b49d715f3a5054ad728da7f1237acb8597.scope.
Nov 23 15:41:02 np0005532763 podman[75440]: 2025-11-23 20:41:02.053394426 +0000 UTC m=+0.037393566 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:02 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:41:02 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f94bf05716a5a0b1666d0cbc1e64d9ca3db71047f79320759dc04442b41c54a7/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:02 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f94bf05716a5a0b1666d0cbc1e64d9ca3db71047f79320759dc04442b41c54a7/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:02 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f94bf05716a5a0b1666d0cbc1e64d9ca3db71047f79320759dc04442b41c54a7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:02 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f94bf05716a5a0b1666d0cbc1e64d9ca3db71047f79320759dc04442b41c54a7/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:02 np0005532763 podman[75440]: 2025-11-23 20:41:02.18283129 +0000 UTC m=+0.166830440 container init 02fdb3055d2929c67f6c99a1bbe3d7b49d715f3a5054ad728da7f1237acb8597 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_greider, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Nov 23 15:41:02 np0005532763 podman[75440]: 2025-11-23 20:41:02.199922482 +0000 UTC m=+0.183921572 container start 02fdb3055d2929c67f6c99a1bbe3d7b49d715f3a5054ad728da7f1237acb8597 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_greider, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Nov 23 15:41:02 np0005532763 podman[75440]: 2025-11-23 20:41:02.204377368 +0000 UTC m=+0.188376528 container attach 02fdb3055d2929c67f6c99a1bbe3d7b49d715f3a5054ad728da7f1237acb8597 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_greider, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Nov 23 15:41:02 np0005532763 systemd[1]: libpod-02fdb3055d2929c67f6c99a1bbe3d7b49d715f3a5054ad728da7f1237acb8597.scope: Deactivated successfully.
Nov 23 15:41:02 np0005532763 podman[75440]: 2025-11-23 20:41:02.308147087 +0000 UTC m=+0.292146217 container died 02fdb3055d2929c67f6c99a1bbe3d7b49d715f3a5054ad728da7f1237acb8597 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_greider, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Nov 23 15:41:02 np0005532763 podman[75440]: 2025-11-23 20:41:02.360462304 +0000 UTC m=+0.344461404 container remove 02fdb3055d2929c67f6c99a1bbe3d7b49d715f3a5054ad728da7f1237acb8597 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_greider, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Nov 23 15:41:02 np0005532763 systemd[1]: libpod-conmon-02fdb3055d2929c67f6c99a1bbe3d7b49d715f3a5054ad728da7f1237acb8597.scope: Deactivated successfully.
Nov 23 15:41:02 np0005532763 systemd[1]: Reloading.
Nov 23 15:41:02 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:41:02 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:41:02 np0005532763 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 15:41:02 np0005532763 systemd[1]: Reloading.
Nov 23 15:41:02 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:41:02 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:41:02 np0005532763 systemd[1]: Reached target All Ceph clusters and services.
Nov 23 15:41:02 np0005532763 systemd[1]: Reloading.
Nov 23 15:41:03 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:41:03 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:41:03 np0005532763 systemd[1]: Reached target Ceph cluster 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:41:03 np0005532763 systemd[1]: Reloading.
Nov 23 15:41:03 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:41:03 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:41:03 np0005532763 systemd[1]: Reloading.
Nov 23 15:41:03 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:41:03 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:41:03 np0005532763 systemd[1]: Created slice Slice /system/ceph-03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:41:03 np0005532763 systemd[1]: Reached target System Time Set.
Nov 23 15:41:03 np0005532763 systemd[1]: Reached target System Time Synchronized.
Nov 23 15:41:03 np0005532763 systemd[1]: Starting Ceph mon.compute-2 for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:41:03 np0005532763 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 15:41:03 np0005532763 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 23 15:41:04 np0005532763 podman[75733]: 2025-11-23 20:41:04.122223822 +0000 UTC m=+0.051111524 container create 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Nov 23 15:41:04 np0005532763 podman[75733]: 2025-11-23 20:41:04.101882197 +0000 UTC m=+0.030769899 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:04 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9d4ef00c27daae15860382de660be0034f2c601cb52fcfe793b87f51a02d9e2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:04 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9d4ef00c27daae15860382de660be0034f2c601cb52fcfe793b87f51a02d9e2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:04 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9d4ef00c27daae15860382de660be0034f2c601cb52fcfe793b87f51a02d9e2/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:04 np0005532763 podman[75733]: 2025-11-23 20:41:04.215551186 +0000 UTC m=+0.144438908 container init 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 23 15:41:04 np0005532763 podman[75733]: 2025-11-23 20:41:04.229214892 +0000 UTC m=+0.158102574 container start 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Nov 23 15:41:04 np0005532763 bash[75733]: 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f
Nov 23 15:41:04 np0005532763 systemd[1]: Started Ceph mon.compute-2 for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: set uid:gid to 167:167 (ceph:ceph)
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: pidfile_write: ignore empty --pid-file
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: load: jerasure load: lrc 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: RocksDB version: 7.9.2
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Git sha 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Compile date 2025-07-17 03:12:14
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: DB SUMMARY
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: DB Session ID:  9X7YYXRZ70MLDLQBPDMX
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: CURRENT file:  CURRENT
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: IDENTITY file:  IDENTITY
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-2/store.db dir, Total Num: 0, files: 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-2/store.db: 000004.log size: 511 ; 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                         Options.error_if_exists: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                       Options.create_if_missing: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                         Options.paranoid_checks: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                                     Options.env: 0x55e7cf98cc20
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                                      Options.fs: PosixFileSystem
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                                Options.info_log: 0x55e7d0ce5a20
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                Options.max_file_opening_threads: 16
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                              Options.statistics: (nil)
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                               Options.use_fsync: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                       Options.max_log_file_size: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                         Options.allow_fallocate: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                        Options.use_direct_reads: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:          Options.create_missing_column_families: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                              Options.db_log_dir: 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                                 Options.wal_dir: 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                   Options.advise_random_on_open: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                    Options.write_buffer_manager: 0x55e7d0ce9900
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                            Options.rate_limiter: (nil)
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                  Options.unordered_write: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                               Options.row_cache: None
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                              Options.wal_filter: None
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.allow_ingest_behind: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.two_write_queues: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.manual_wal_flush: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.wal_compression: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.atomic_flush: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                 Options.log_readahead_size: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.allow_data_in_errors: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.db_host_id: __hostname__
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.max_background_jobs: 2
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.max_background_compactions: -1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.max_subcompactions: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.max_total_wal_size: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                          Options.max_open_files: -1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                          Options.bytes_per_sync: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:       Options.compaction_readahead_size: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                  Options.max_background_flushes: -1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Compression algorithms supported:
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: #011kZSTD supported: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: #011kXpressCompression supported: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: #011kBZip2Compression supported: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: #011kLZ4Compression supported: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: #011kZlibCompression supported: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: #011kLZ4HCCompression supported: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: #011kSnappyCompression supported: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:           Options.merge_operator: 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e7d0ce45c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55e7d0d09350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:        Options.write_buffer_size: 33554432
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:  Options.max_write_buffer_number: 2
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:          Options.compression: NoCompression
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 2623c212-36b1-4df9-b695-1a7be3fdfc0c
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930464301998, "job": 1, "event": "recovery_started", "wal_files": [4]}
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930464304429, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930464304544, "job": 1, "event": "recovery_finished"}
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55e7d0d0ae00
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: DB pointer 0x55e7d0e14000
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.11 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.11 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e7d0d09350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2 does not exist in monmap, will attempt to join an existing cluster
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: using public_addr v2:192.168.122.102:0/0 -> [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: starting mon.compute-2 rank -1 at public addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] at bind addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-2 fsid 03808be8-ae4a-5548-82e6-4a294f1bc627
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(???) e0 preinit fsid 03808be8-ae4a-5548-82e6-4a294f1bc627
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).mds e1 new map
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).mds e1 print_map#012e1#012btime 2025-11-23T20:38:56:367641+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: -1#012 #012No filesystems configured
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e8 e8: 2 total, 1 up, 2 in
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e9 e9: 2 total, 2 up, 2 in
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e10 e10: 2 total, 2 up, 2 in
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e14 e14: 2 total, 2 up, 2 in
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e15 e15: 2 total, 2 up, 2 in
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e16 e16: 2 total, 2 up, 2 in
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e16 crush map has features 3314933000852226048, adjusting msgr requires
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e16 crush map has features 288514051259236352, adjusting msgr requires
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e16 crush map has features 288514051259236352, adjusting msgr requires
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).osd e16 crush map has features 288514051259236352, adjusting msgr requires
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Updating compute-1:/etc/ceph/ceph.conf
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Updating compute-1:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.conf
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Updating compute-1:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.client.admin.keyring
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Failed to apply mon spec MONSpec.from_json(yaml.safe_load('''service_type: mon#012service_name: mon#012placement:#012  hosts:#012  - compute-0#012  - compute-1#012  - compute-2#012''')): Cannot place <MONSpec for service_name=mon> on compute-2: Unknown hosts
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Failed to apply mgr spec ServiceSpec.from_json(yaml.safe_load('''service_type: mgr#012service_name: mgr#012placement:#012  hosts:#012  - compute-0#012  - compute-1#012  - compute-2#012''')): Cannot place <ServiceSpec for service_name=mgr> on compute-2: Unknown hosts
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Deploying daemon crash.compute-1 on compute-1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Health check failed: Failed to apply 2 service(s): mon,mgr (CEPHADM_APPLY_SPEC_FAIL)
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.101:0/2074746697' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "f9775703-f092-47d3-b1e4-23e694631322"}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/459267552' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "71c99843-04fc-447b-a9fd-4e17520a545c"}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.101:0/2074746697' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "f9775703-f092-47d3-b1e4-23e694631322"}]': finished
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/459267552' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "71c99843-04fc-447b-a9fd-4e17520a545c"}]': finished
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Health check cleared: TOO_FEW_OSDS (was: OSD count 0 < osd_pool_default_size 1)
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Deploying daemon osd.0 on compute-1
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Deploying daemon osd.1 on compute-0
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='osd.0 [v2:192.168.122.101:6800/220289678,v1:192.168.122.101:6801/220289678]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='osd.1 [v2:192.168.122.100:6802/2449545263,v1:192.168.122.100:6803/2449545263]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='osd.0 [v2:192.168.122.101:6800/220289678,v1:192.168.122.101:6801/220289678]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='osd.1 [v2:192.168.122.100:6802/2449545263,v1:192.168.122.100:6803/2449545263]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='osd.1 [v2:192.168.122.100:6802/2449545263,v1:192.168.122.100:6803/2449545263]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='osd.0 [v2:192.168.122.101:6800/220289678,v1:192.168.122.101:6801/220289678]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-1", "root=default"]}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='osd.1 [v2:192.168.122.100:6802/2449545263,v1:192.168.122.100:6803/2449545263]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='osd.0 [v2:192.168.122.101:6800/220289678,v1:192.168.122.101:6801/220289678]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-1", "root=default"]}]': finished
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Adjusting osd_memory_target on compute-0 to 127.9M
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Unable to set osd_memory_target on compute-0 to 134211993: error parsing value: Value '134211993' is below minimum 939524096
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Adjusting osd_memory_target on compute-1 to  5248M
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: OSD bench result of 2031.118864 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: osd.1 [v2:192.168.122.100:6802/2449545263,v1:192.168.122.100:6803/2449545263] boot
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: OSD bench result of 6269.861471 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: osd.0 [v2:192.168.122.101:6800/220289678,v1:192.168.122.101:6801/220289678] boot
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Updating compute-2:/etc/ceph/ceph.conf
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1130454146' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Updating compute-2:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.conf
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1130454146' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1425917096' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1425917096' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Updating compute-2:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.client.admin.keyring
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/4197123902' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/4197123902' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Deploying daemon mon.compute-2 on compute-2
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1651014750' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 23 15:41:04 np0005532763 ceph-mon[75752]: mon.compute-2@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Nov 23 15:41:06 np0005532763 ceph-mon[75752]: mon.compute-2@-1(probing) e1  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Nov 23 15:41:06 np0005532763 ceph-mon[75752]: mon.compute-2@-1(probing) e1  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Nov 23 15:41:06 np0005532763 ceph-mon[75752]: mon.compute-2@-1(probing) e2  my rank is now 1 (was -1)
Nov 23 15:41:06 np0005532763 ceph-mon[75752]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Nov 23 15:41:06 np0005532763 ceph-mon[75752]: paxos.1).electionLogic(0) init, first boot, initializing epoch at 1 
Nov 23 15:41:06 np0005532763 ceph-mon[75752]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 15:41:08 np0005532763 ceph-mon[75752]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Nov 23 15:41:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 15:41:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Nov 23 15:41:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Nov 23 15:41:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 15:41:09 np0005532763 ceph-mon[75752]: mgrc update_daemon_metadata mon.compute-2 metadata {addrs=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-2,kernel_description=#1 SMP PREEMPT_DYNAMIC Sat Nov 15 10:30:41 UTC 2025,kernel_version=5.14.0-639.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864320,os=Linux}
Nov 23 15:41:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e17 e17: 2 total, 2 up, 2 in
Nov 23 15:41:09 np0005532763 ceph-mon[75752]: Deploying daemon mon.compute-1 on compute-1
Nov 23 15:41:09 np0005532763 ceph-mon[75752]: mon.compute-0 calling monitor election
Nov 23 15:41:09 np0005532763 ceph-mon[75752]: mon.compute-2 calling monitor election
Nov 23 15:41:09 np0005532763 ceph-mon[75752]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Nov 23 15:41:09 np0005532763 ceph-mon[75752]: Health detail: HEALTH_WARN 3 pool(s) do not have an application enabled
Nov 23 15:41:09 np0005532763 ceph-mon[75752]: [WRN] POOL_APP_NOT_ENABLED: 3 pool(s) do not have an application enabled
Nov 23 15:41:09 np0005532763 ceph-mon[75752]:    application not enabled on pool 'vms'
Nov 23 15:41:09 np0005532763 ceph-mon[75752]:    application not enabled on pool 'volumes'
Nov 23 15:41:09 np0005532763 ceph-mon[75752]:    application not enabled on pool 'backups'
Nov 23 15:41:09 np0005532763 ceph-mon[75752]:    use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Nov 23 15:41:09 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:09 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:09 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:09 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:09 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.jtkauz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Nov 23 15:41:10 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Nov 23 15:41:10 np0005532763 ceph-mon[75752]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Nov 23 15:41:10 np0005532763 ceph-mon[75752]: paxos.1).electionLogic(10) init, last seen epoch 10
Nov 23 15:41:10 np0005532763 ceph-mon[75752]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 15:41:10 np0005532763 podman[75882]: 2025-11-23 20:41:10.304500013 +0000 UTC m=+0.071095918 container create 0227340c5d3db953805757004d9c8e8f590d3de1319fcb01b1492e7bcd721569 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_elgamal, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 15:41:10 np0005532763 systemd[1]: Started libpod-conmon-0227340c5d3db953805757004d9c8e8f590d3de1319fcb01b1492e7bcd721569.scope.
Nov 23 15:41:10 np0005532763 podman[75882]: 2025-11-23 20:41:10.27108233 +0000 UTC m=+0.037678315 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:10 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:41:10 np0005532763 podman[75882]: 2025-11-23 20:41:10.424258093 +0000 UTC m=+0.190853998 container init 0227340c5d3db953805757004d9c8e8f590d3de1319fcb01b1492e7bcd721569 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_elgamal, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 23 15:41:10 np0005532763 podman[75882]: 2025-11-23 20:41:10.436656753 +0000 UTC m=+0.203252658 container start 0227340c5d3db953805757004d9c8e8f590d3de1319fcb01b1492e7bcd721569 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_elgamal, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Nov 23 15:41:10 np0005532763 podman[75882]: 2025-11-23 20:41:10.440601365 +0000 UTC m=+0.207197280 container attach 0227340c5d3db953805757004d9c8e8f590d3de1319fcb01b1492e7bcd721569 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_elgamal, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 15:41:10 np0005532763 wonderful_elgamal[75899]: 167 167
Nov 23 15:41:10 np0005532763 systemd[1]: libpod-0227340c5d3db953805757004d9c8e8f590d3de1319fcb01b1492e7bcd721569.scope: Deactivated successfully.
Nov 23 15:41:10 np0005532763 podman[75882]: 2025-11-23 20:41:10.447122399 +0000 UTC m=+0.213718294 container died 0227340c5d3db953805757004d9c8e8f590d3de1319fcb01b1492e7bcd721569 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_elgamal, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325)
Nov 23 15:41:10 np0005532763 systemd[1]: var-lib-containers-storage-overlay-4400771c55adbea13dd321a0801d1c7549cd12423c936a9fb49e6b4305852891-merged.mount: Deactivated successfully.
Nov 23 15:41:10 np0005532763 podman[75882]: 2025-11-23 20:41:10.489981139 +0000 UTC m=+0.256577054 container remove 0227340c5d3db953805757004d9c8e8f590d3de1319fcb01b1492e7bcd721569 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=wonderful_elgamal, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 15:41:10 np0005532763 systemd[1]: libpod-conmon-0227340c5d3db953805757004d9c8e8f590d3de1319fcb01b1492e7bcd721569.scope: Deactivated successfully.
Nov 23 15:41:10 np0005532763 systemd[1]: Reloading.
Nov 23 15:41:10 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:41:10 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:41:10 np0005532763 systemd[1]: Reloading.
Nov 23 15:41:10 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:41:10 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:41:11 np0005532763 systemd[1]: Starting Ceph mgr.compute-2.jtkauz for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:41:11 np0005532763 podman[76043]: 2025-11-23 20:41:11.502599331 +0000 UTC m=+0.070661945 container create 21c1b17ca8177b004aadf1c5f4484b29d858350a608be5758ce46c5c1cc43272 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 23 15:41:11 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84cc05f68de77337695e51ee936f6f5c7ea09580532579950988567b3342f502/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:11 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84cc05f68de77337695e51ee936f6f5c7ea09580532579950988567b3342f502/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:11 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84cc05f68de77337695e51ee936f6f5c7ea09580532579950988567b3342f502/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:11 np0005532763 podman[76043]: 2025-11-23 20:41:11.471320118 +0000 UTC m=+0.039382772 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:11 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/84cc05f68de77337695e51ee936f6f5c7ea09580532579950988567b3342f502/merged/var/lib/ceph/mgr/ceph-compute-2.jtkauz supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:11 np0005532763 podman[76043]: 2025-11-23 20:41:11.58049985 +0000 UTC m=+0.148562464 container init 21c1b17ca8177b004aadf1c5f4484b29d858350a608be5758ce46c5c1cc43272 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 15:41:11 np0005532763 podman[76043]: 2025-11-23 20:41:11.593466386 +0000 UTC m=+0.161528970 container start 21c1b17ca8177b004aadf1c5f4484b29d858350a608be5758ce46c5c1cc43272 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Nov 23 15:41:11 np0005532763 bash[76043]: 21c1b17ca8177b004aadf1c5f4484b29d858350a608be5758ce46c5c1cc43272
Nov 23 15:41:11 np0005532763 systemd[1]: Started Ceph mgr.compute-2.jtkauz for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:41:11 np0005532763 ceph-mon[75752]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Nov 23 15:41:11 np0005532763 ceph-mon[75752]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Nov 23 15:41:12 np0005532763 ceph-mon[75752]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Nov 23 15:41:13 np0005532763 ceph-mon[75752]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Nov 23 15:41:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Nov 23 15:41:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Nov 23 15:41:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Nov 23 15:41:15 np0005532763 ceph-mon[75752]: paxos.1).electionLogic(11) init, last seen epoch 11, mid-election, bumping
Nov 23 15:41:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 15:41:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 15:41:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Nov 23 15:41:16 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Nov 23 15:41:16 np0005532763 ceph-mgr[76063]: set uid:gid to 167:167 (ceph:ceph)
Nov 23 15:41:16 np0005532763 ceph-mgr[76063]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Nov 23 15:41:16 np0005532763 ceph-mgr[76063]: pidfile_write: ignore empty --pid-file
Nov 23 15:41:16 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Nov 23 15:41:16 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'alerts'
Nov 23 15:41:16 np0005532763 ceph-mgr[76063]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 23 15:41:16 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'balancer'
Nov 23 15:41:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:16.239+0000 7f26cbf0b140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 23 15:41:16 np0005532763 ceph-mgr[76063]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 23 15:41:16 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'cephadm'
Nov 23 15:41:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:16.316+0000 7f26cbf0b140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 23 15:41:16 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Nov 23 15:41:17 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'crash'
Nov 23 15:41:17 np0005532763 ceph-mgr[76063]: mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 23 15:41:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:17.141+0000 7f26cbf0b140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 23 15:41:17 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'dashboard'
Nov 23 15:41:17 np0005532763 ceph-mon[75752]: Deploying daemon mgr.compute-2.jtkauz on compute-2
Nov 23 15:41:17 np0005532763 ceph-mon[75752]: mon.compute-0 calling monitor election
Nov 23 15:41:17 np0005532763 ceph-mon[75752]: mon.compute-2 calling monitor election
Nov 23 15:41:17 np0005532763 ceph-mon[75752]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Nov 23 15:41:17 np0005532763 ceph-mon[75752]: Health detail: HEALTH_WARN 4 pool(s) do not have an application enabled
Nov 23 15:41:17 np0005532763 ceph-mon[75752]: [WRN] POOL_APP_NOT_ENABLED: 4 pool(s) do not have an application enabled
Nov 23 15:41:17 np0005532763 ceph-mon[75752]:    application not enabled on pool 'vms'
Nov 23 15:41:17 np0005532763 ceph-mon[75752]:    application not enabled on pool 'volumes'
Nov 23 15:41:17 np0005532763 ceph-mon[75752]:    application not enabled on pool 'backups'
Nov 23 15:41:17 np0005532763 ceph-mon[75752]:    application not enabled on pool 'images'
Nov 23 15:41:17 np0005532763 ceph-mon[75752]:    use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Nov 23 15:41:17 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'devicehealth'
Nov 23 15:41:17 np0005532763 ceph-mgr[76063]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 23 15:41:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:17.765+0000 7f26cbf0b140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 23 15:41:17 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'diskprediction_local'
Nov 23 15:41:17 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e18 e18: 2 total, 2 up, 2 in
Nov 23 15:41:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Nov 23 15:41:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Nov 23 15:41:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]:  from numpy import show_config as show_numpy_config
Nov 23 15:41:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:17.921+0000 7f26cbf0b140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 23 15:41:17 np0005532763 ceph-mgr[76063]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 23 15:41:17 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'influx'
Nov 23 15:41:17 np0005532763 ceph-mgr[76063]: mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 23 15:41:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:17.987+0000 7f26cbf0b140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 23 15:41:17 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'insights'
Nov 23 15:41:18 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'iostat'
Nov 23 15:41:18 np0005532763 ceph-mgr[76063]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 23 15:41:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:18.118+0000 7f26cbf0b140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 23 15:41:18 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'k8sevents'
Nov 23 15:41:18 np0005532763 ceph-mon[75752]: mon.compute-1 calling monitor election
Nov 23 15:41:18 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:18 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/2361136095' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 23 15:41:18 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:18 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:18 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Nov 23 15:41:18 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/2361136095' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 23 15:41:18 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'localpool'
Nov 23 15:41:18 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'mds_autoscaler'
Nov 23 15:41:18 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'mirroring'
Nov 23 15:41:18 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e19 e19: 2 total, 2 up, 2 in
Nov 23 15:41:18 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'nfs'
Nov 23 15:41:19 np0005532763 ceph-mgr[76063]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 23 15:41:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:19.109+0000 7f26cbf0b140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 23 15:41:19 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'orchestrator'
Nov 23 15:41:19 np0005532763 ceph-mgr[76063]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 23 15:41:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:19.317+0000 7f26cbf0b140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 23 15:41:19 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'osd_perf_query'
Nov 23 15:41:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e19 _set_new_cache_sizes cache_size:1019935940 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:41:19 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:19 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.kgyerp", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Nov 23 15:41:19 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.kgyerp", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Nov 23 15:41:19 np0005532763 ceph-mon[75752]: Deploying daemon mgr.compute-1.kgyerp on compute-1
Nov 23 15:41:19 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Nov 23 15:41:19 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Nov 23 15:41:19 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 23 15:41:19 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/3743302872' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Nov 23 15:41:19 np0005532763 ceph-mgr[76063]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 23 15:41:19 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'osd_support'
Nov 23 15:41:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:19.391+0000 7f26cbf0b140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 23 15:41:19 np0005532763 ceph-mgr[76063]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 23 15:41:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:19.457+0000 7f26cbf0b140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 23 15:41:19 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'pg_autoscaler'
Nov 23 15:41:19 np0005532763 ceph-mgr[76063]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 23 15:41:19 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'progress'
Nov 23 15:41:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:19.539+0000 7f26cbf0b140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 23 15:41:19 np0005532763 ceph-mgr[76063]: mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 23 15:41:19 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'prometheus'
Nov 23 15:41:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:19.611+0000 7f26cbf0b140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 23 15:41:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e20 e20: 2 total, 2 up, 2 in
Nov 23 15:41:19 np0005532763 ceph-mgr[76063]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 23 15:41:19 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'rbd_support'
Nov 23 15:41:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:19.961+0000 7f26cbf0b140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 23 15:41:20 np0005532763 ceph-mgr[76063]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 23 15:41:20 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'restful'
Nov 23 15:41:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:20.063+0000 7f26cbf0b140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 23 15:41:20 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'rgw'
Nov 23 15:41:20 np0005532763 ceph-mon[75752]: Health check update: 5 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 23 15:41:20 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Nov 23 15:41:20 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Nov 23 15:41:20 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/3743302872' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Nov 23 15:41:20 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Nov 23 15:41:20 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:20 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:20 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:20 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:20 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Nov 23 15:41:20 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Nov 23 15:41:20 np0005532763 ceph-mgr[76063]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 23 15:41:20 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'rook'
Nov 23 15:41:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:20.498+0000 7f26cbf0b140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 23 15:41:20 np0005532763 podman[76186]: 2025-11-23 20:41:20.728835304 +0000 UTC m=+0.068788251 container create 79e6053afd44b404370035431a46548fc36b06c9c3ac1909f4689a3d838db8a4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_hofstadter, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=squid)
Nov 23 15:41:20 np0005532763 systemd[1]: Started libpod-conmon-79e6053afd44b404370035431a46548fc36b06c9c3ac1909f4689a3d838db8a4.scope.
Nov 23 15:41:20 np0005532763 podman[76186]: 2025-11-23 20:41:20.699810678 +0000 UTC m=+0.039763635 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:20 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:41:20 np0005532763 podman[76186]: 2025-11-23 20:41:20.832192616 +0000 UTC m=+0.172145593 container init 79e6053afd44b404370035431a46548fc36b06c9c3ac1909f4689a3d838db8a4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_hofstadter, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 23 15:41:20 np0005532763 podman[76186]: 2025-11-23 20:41:20.845627909 +0000 UTC m=+0.185580846 container start 79e6053afd44b404370035431a46548fc36b06c9c3ac1909f4689a3d838db8a4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_hofstadter, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:41:20 np0005532763 podman[76186]: 2025-11-23 20:41:20.851204684 +0000 UTC m=+0.191157621 container attach 79e6053afd44b404370035431a46548fc36b06c9c3ac1909f4689a3d838db8a4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_hofstadter, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:41:20 np0005532763 dreamy_hofstadter[76202]: 167 167
Nov 23 15:41:20 np0005532763 systemd[1]: libpod-79e6053afd44b404370035431a46548fc36b06c9c3ac1909f4689a3d838db8a4.scope: Deactivated successfully.
Nov 23 15:41:20 np0005532763 podman[76186]: 2025-11-23 20:41:20.856584143 +0000 UTC m=+0.196537090 container died 79e6053afd44b404370035431a46548fc36b06c9c3ac1909f4689a3d838db8a4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_hofstadter, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 23 15:41:20 np0005532763 systemd[1]: var-lib-containers-storage-overlay-f7044b9bcd6eebcec3f116c00c7901692323d8f9265ac726dddfca38a4cc3120-merged.mount: Deactivated successfully.
Nov 23 15:41:20 np0005532763 podman[76186]: 2025-11-23 20:41:20.914172073 +0000 UTC m=+0.254125010 container remove 79e6053afd44b404370035431a46548fc36b06c9c3ac1909f4689a3d838db8a4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_hofstadter, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Nov 23 15:41:20 np0005532763 systemd[1]: libpod-conmon-79e6053afd44b404370035431a46548fc36b06c9c3ac1909f4689a3d838db8a4.scope: Deactivated successfully.
Nov 23 15:41:20 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e21 e21: 2 total, 2 up, 2 in
Nov 23 15:41:20 np0005532763 systemd[1]: Reloading.
Nov 23 15:41:21 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:41:21 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:41:21 np0005532763 ceph-mgr[76063]: mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 23 15:41:21 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'selftest'
Nov 23 15:41:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:21.099+0000 7f26cbf0b140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 23 15:41:21 np0005532763 ceph-mgr[76063]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 23 15:41:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:21.199+0000 7f26cbf0b140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 23 15:41:21 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'snap_schedule'
Nov 23 15:41:21 np0005532763 ceph-mgr[76063]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 23 15:41:21 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'stats'
Nov 23 15:41:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:21.294+0000 7f26cbf0b140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 23 15:41:21 np0005532763 systemd[1]: Reloading.
Nov 23 15:41:21 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'status'
Nov 23 15:41:21 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:41:21 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:41:21 np0005532763 ceph-mon[75752]: Deploying daemon crash.compute-2 on compute-2
Nov 23 15:41:21 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/39405231' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Nov 23 15:41:21 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Nov 23 15:41:21 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/39405231' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Nov 23 15:41:21 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Nov 23 15:41:21 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 23 15:41:21 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 23 15:41:21 np0005532763 ceph-mgr[76063]: mgr[py] Module status has missing NOTIFY_TYPES member
Nov 23 15:41:21 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'telegraf'
Nov 23 15:41:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:21.470+0000 7f26cbf0b140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Nov 23 15:41:21 np0005532763 ceph-mgr[76063]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 23 15:41:21 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'telemetry'
Nov 23 15:41:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:21.544+0000 7f26cbf0b140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 23 15:41:21 np0005532763 systemd[1]: Starting Ceph crash.compute-2 for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:41:21 np0005532763 ceph-mgr[76063]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 23 15:41:21 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'test_orchestrator'
Nov 23 15:41:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:21.722+0000 7f26cbf0b140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 23 15:41:21 np0005532763 ceph-mgr[76063]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 23 15:41:21 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'volumes'
Nov 23 15:41:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:21.955+0000 7f26cbf0b140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 23 15:41:21 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e22 e22: 2 total, 2 up, 2 in
Nov 23 15:41:21 np0005532763 podman[76339]: 2025-11-23 20:41:21.974057624 +0000 UTC m=+0.077974607 container create 4ad194abaacb94b1e1a40a7317667e5342e7ea8de383e078d9e083bf5cc1aa42 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-crash-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Nov 23 15:41:22 np0005532763 podman[76339]: 2025-11-23 20:41:21.943042112 +0000 UTC m=+0.046959155 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:22 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d55b7c9c88f78c0838444f9b00020b54eef7b344e1de91752faa2b6c7ec47746/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:22 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d55b7c9c88f78c0838444f9b00020b54eef7b344e1de91752faa2b6c7ec47746/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:22 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d55b7c9c88f78c0838444f9b00020b54eef7b344e1de91752faa2b6c7ec47746/merged/etc/ceph/ceph.client.crash.compute-2.keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:22 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d55b7c9c88f78c0838444f9b00020b54eef7b344e1de91752faa2b6c7ec47746/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:22 np0005532763 podman[76339]: 2025-11-23 20:41:22.051820964 +0000 UTC m=+0.155738007 container init 4ad194abaacb94b1e1a40a7317667e5342e7ea8de383e078d9e083bf5cc1aa42 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-crash-compute-2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1)
Nov 23 15:41:22 np0005532763 podman[76339]: 2025-11-23 20:41:22.062048818 +0000 UTC m=+0.165965811 container start 4ad194abaacb94b1e1a40a7317667e5342e7ea8de383e078d9e083bf5cc1aa42 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-crash-compute-2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 15:41:22 np0005532763 bash[76339]: 4ad194abaacb94b1e1a40a7317667e5342e7ea8de383e078d9e083bf5cc1aa42
Nov 23 15:41:22 np0005532763 systemd[1]: Started Ceph crash.compute-2 for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:41:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-crash-compute-2[76355]: INFO:ceph-crash:pinging cluster to exercise our key
Nov 23 15:41:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:22.228+0000 7f26cbf0b140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 23 15:41:22 np0005532763 ceph-mgr[76063]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 23 15:41:22 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'zabbix'
Nov 23 15:41:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-crash-compute-2[76355]: 2025-11-23T20:41:22.247+0000 7faece4f4640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 23 15:41:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-crash-compute-2[76355]: 2025-11-23T20:41:22.247+0000 7faece4f4640 -1 AuthRegistry(0x7faec8069b10) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 23 15:41:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-crash-compute-2[76355]: 2025-11-23T20:41:22.249+0000 7faece4f4640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Nov 23 15:41:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-crash-compute-2[76355]: 2025-11-23T20:41:22.249+0000 7faece4f4640 -1 AuthRegistry(0x7faece4f2ff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Nov 23 15:41:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-crash-compute-2[76355]: 2025-11-23T20:41:22.250+0000 7faecd4f2640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 23 15:41:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-crash-compute-2[76355]: 2025-11-23T20:41:22.251+0000 7faecdcf3640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 23 15:41:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-crash-compute-2[76355]: 2025-11-23T20:41:22.251+0000 7faecccf1640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Nov 23 15:41:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-crash-compute-2[76355]: 2025-11-23T20:41:22.251+0000 7faece4f4640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Nov 23 15:41:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-crash-compute-2[76355]: [errno 13] RADOS permission denied (error connecting to the cluster)
Nov 23 15:41:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-crash-compute-2[76355]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Nov 23 15:41:22 np0005532763 ceph-mgr[76063]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 23 15:41:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:22.300+0000 7f26cbf0b140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 23 15:41:22 np0005532763 ceph-mgr[76063]: ms_deliver_dispatch: unhandled message 0x5565bad10d00 mon_map magic: 0 from mon.0 v2:192.168.122.100:3300/0
Nov 23 15:41:22 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Nov 23 15:41:22 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Nov 23 15:41:22 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Nov 23 15:41:22 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1243267938' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Nov 23 15:41:22 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:22 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:22 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:22 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:22 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:41:22 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:41:22 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:22 np0005532763 podman[76463]: 2025-11-23 20:41:22.857810612 +0000 UTC m=+0.060013658 container create 4e6fb4ad542dcac5a7125e991d0a9caef51e7248e5bf51e01ed6ab93815be663 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_sinoussi, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0)
Nov 23 15:41:22 np0005532763 systemd[1]: Started libpod-conmon-4e6fb4ad542dcac5a7125e991d0a9caef51e7248e5bf51e01ed6ab93815be663.scope.
Nov 23 15:41:22 np0005532763 podman[76463]: 2025-11-23 20:41:22.827322815 +0000 UTC m=+0.029525921 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:22 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:41:22 np0005532763 podman[76463]: 2025-11-23 20:41:22.963798946 +0000 UTC m=+0.166002002 container init 4e6fb4ad542dcac5a7125e991d0a9caef51e7248e5bf51e01ed6ab93815be663 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_sinoussi, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Nov 23 15:41:22 np0005532763 podman[76463]: 2025-11-23 20:41:22.975187193 +0000 UTC m=+0.177390239 container start 4e6fb4ad542dcac5a7125e991d0a9caef51e7248e5bf51e01ed6ab93815be663 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_sinoussi, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325)
Nov 23 15:41:22 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e23 e23: 2 total, 2 up, 2 in
Nov 23 15:41:22 np0005532763 podman[76463]: 2025-11-23 20:41:22.982075224 +0000 UTC m=+0.184278330 container attach 4e6fb4ad542dcac5a7125e991d0a9caef51e7248e5bf51e01ed6ab93815be663 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_sinoussi, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Nov 23 15:41:22 np0005532763 clever_sinoussi[76479]: 167 167
Nov 23 15:41:22 np0005532763 systemd[1]: libpod-4e6fb4ad542dcac5a7125e991d0a9caef51e7248e5bf51e01ed6ab93815be663.scope: Deactivated successfully.
Nov 23 15:41:22 np0005532763 podman[76463]: 2025-11-23 20:41:22.985705705 +0000 UTC m=+0.187908761 container died 4e6fb4ad542dcac5a7125e991d0a9caef51e7248e5bf51e01ed6ab93815be663 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_sinoussi, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 15:41:23 np0005532763 systemd[1]: var-lib-containers-storage-overlay-376f12de73cc9dff0e9d6bf8641d4690fc9b2f692abddf3fd92d579839c5b357-merged.mount: Deactivated successfully.
Nov 23 15:41:23 np0005532763 podman[76463]: 2025-11-23 20:41:23.033987786 +0000 UTC m=+0.236190842 container remove 4e6fb4ad542dcac5a7125e991d0a9caef51e7248e5bf51e01ed6ab93815be663 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_sinoussi, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 23 15:41:23 np0005532763 systemd[1]: libpod-conmon-4e6fb4ad542dcac5a7125e991d0a9caef51e7248e5bf51e01ed6ab93815be663.scope: Deactivated successfully.
Nov 23 15:41:23 np0005532763 podman[76502]: 2025-11-23 20:41:23.287096857 +0000 UTC m=+0.075075106 container create 2d2425666546f1a0e4f8d53ad1ff55f8f51d3d09f4a82f20cb0613e20f2f08cb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_dhawan, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 23 15:41:23 np0005532763 systemd[1]: Started libpod-conmon-2d2425666546f1a0e4f8d53ad1ff55f8f51d3d09f4a82f20cb0613e20f2f08cb.scope.
Nov 23 15:41:23 np0005532763 podman[76502]: 2025-11-23 20:41:23.251981391 +0000 UTC m=+0.039959690 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:23 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:41:23 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24259c81892eb149a79bb93722ac6e0b3201fb478fed42a5e92948c324276313/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:23 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24259c81892eb149a79bb93722ac6e0b3201fb478fed42a5e92948c324276313/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:23 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24259c81892eb149a79bb93722ac6e0b3201fb478fed42a5e92948c324276313/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:23 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24259c81892eb149a79bb93722ac6e0b3201fb478fed42a5e92948c324276313/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:23 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24259c81892eb149a79bb93722ac6e0b3201fb478fed42a5e92948c324276313/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:23 np0005532763 podman[76502]: 2025-11-23 20:41:23.405907327 +0000 UTC m=+0.193885576 container init 2d2425666546f1a0e4f8d53ad1ff55f8f51d3d09f4a82f20cb0613e20f2f08cb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_dhawan, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 15:41:23 np0005532763 podman[76502]: 2025-11-23 20:41:23.4218518 +0000 UTC m=+0.209830059 container start 2d2425666546f1a0e4f8d53ad1ff55f8f51d3d09f4a82f20cb0613e20f2f08cb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_dhawan, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Nov 23 15:41:23 np0005532763 podman[76502]: 2025-11-23 20:41:23.42687451 +0000 UTC m=+0.214852759 container attach 2d2425666546f1a0e4f8d53ad1ff55f8f51d3d09f4a82f20cb0613e20f2f08cb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_dhawan, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Nov 23 15:41:23 np0005532763 affectionate_dhawan[76518]: --> passed data devices: 0 physical, 1 LVM
Nov 23 15:41:23 np0005532763 affectionate_dhawan[76518]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 23 15:41:23 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e24 e24: 2 total, 2 up, 2 in
Nov 23 15:41:24 np0005532763 affectionate_dhawan[76518]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 23 15:41:24 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1243267938' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Nov 23 15:41:24 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 23 15:41:24 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/2261115406' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Nov 23 15:41:24 np0005532763 affectionate_dhawan[76518]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 89316dd3-297e-4d1b-953e-7f2ac7cbe63c
Nov 23 15:41:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e24 _set_new_cache_sizes cache_size:1020053145 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:41:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd new", "uuid": "89316dd3-297e-4d1b-953e-7f2ac7cbe63c"} v 0)
Nov 23 15:41:24 np0005532763 ceph-mon[75752]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1014258786' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "89316dd3-297e-4d1b-953e-7f2ac7cbe63c"}]: dispatch
Nov 23 15:41:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e25 e25: 3 total, 2 up, 3 in
Nov 23 15:41:24 np0005532763 affectionate_dhawan[76518]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Nov 23 15:41:24 np0005532763 affectionate_dhawan[76518]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Nov 23 15:41:24 np0005532763 affectionate_dhawan[76518]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 23 15:41:24 np0005532763 affectionate_dhawan[76518]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:24 np0005532763 lvm[76579]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 15:41:24 np0005532763 lvm[76579]: VG ceph_vg0 finished
Nov 23 15:41:24 np0005532763 affectionate_dhawan[76518]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Nov 23 15:41:25 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Nov 23 15:41:25 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/2261115406' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Nov 23 15:41:25 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.102:0/1014258786' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "89316dd3-297e-4d1b-953e-7f2ac7cbe63c"}]: dispatch
Nov 23 15:41:25 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "89316dd3-297e-4d1b-953e-7f2ac7cbe63c"}]: dispatch
Nov 23 15:41:25 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "89316dd3-297e-4d1b-953e-7f2ac7cbe63c"}]': finished
Nov 23 15:41:25 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/4110558162' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Nov 23 15:41:25 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon getmap"} v 0)
Nov 23 15:41:25 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3255099167' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Nov 23 15:41:25 np0005532763 affectionate_dhawan[76518]: stderr: got monmap epoch 3
Nov 23 15:41:25 np0005532763 affectionate_dhawan[76518]: --> Creating keyring file for osd.2
Nov 23 15:41:25 np0005532763 affectionate_dhawan[76518]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Nov 23 15:41:25 np0005532763 affectionate_dhawan[76518]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Nov 23 15:41:25 np0005532763 affectionate_dhawan[76518]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid 89316dd3-297e-4d1b-953e-7f2ac7cbe63c --setuser ceph --setgroup ceph
Nov 23 15:41:25 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e26 e26: 3 total, 2 up, 3 in
Nov 23 15:41:26 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/4110558162' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Nov 23 15:41:27 np0005532763 ceph-mon[75752]: Health check update: 3 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 23 15:41:27 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/54502927' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Nov 23 15:41:27 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:27 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:27 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e27 e27: 3 total, 2 up, 3 in
Nov 23 15:41:28 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/54502927' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Nov 23 15:41:28 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/330844918' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Nov 23 15:41:28 np0005532763 affectionate_dhawan[76518]: stderr: 2025-11-23T20:41:25.269+0000 7f12c04ec740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) No valid bdev label found
Nov 23 15:41:28 np0005532763 affectionate_dhawan[76518]: stderr: 2025-11-23T20:41:25.531+0000 7f12c04ec740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Nov 23 15:41:28 np0005532763 affectionate_dhawan[76518]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Nov 23 15:41:28 np0005532763 affectionate_dhawan[76518]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 23 15:41:28 np0005532763 affectionate_dhawan[76518]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Nov 23 15:41:28 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e28 e28: 3 total, 2 up, 3 in
Nov 23 15:41:29 np0005532763 affectionate_dhawan[76518]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:29 np0005532763 affectionate_dhawan[76518]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:29 np0005532763 affectionate_dhawan[76518]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 23 15:41:29 np0005532763 affectionate_dhawan[76518]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 23 15:41:29 np0005532763 affectionate_dhawan[76518]: --> ceph-volume lvm activate successful for osd ID: 2
Nov 23 15:41:29 np0005532763 affectionate_dhawan[76518]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Nov 23 15:41:29 np0005532763 systemd[1]: libpod-2d2425666546f1a0e4f8d53ad1ff55f8f51d3d09f4a82f20cb0613e20f2f08cb.scope: Deactivated successfully.
Nov 23 15:41:29 np0005532763 systemd[1]: libpod-2d2425666546f1a0e4f8d53ad1ff55f8f51d3d09f4a82f20cb0613e20f2f08cb.scope: Consumed 2.544s CPU time.
Nov 23 15:41:29 np0005532763 podman[76502]: 2025-11-23 20:41:29.298812796 +0000 UTC m=+6.086791045 container died 2d2425666546f1a0e4f8d53ad1ff55f8f51d3d09f4a82f20cb0613e20f2f08cb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_dhawan, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 15:41:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e28 _set_new_cache_sizes cache_size:1020054710 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:41:29 np0005532763 systemd[1]: var-lib-containers-storage-overlay-24259c81892eb149a79bb93722ac6e0b3201fb478fed42a5e92948c324276313-merged.mount: Deactivated successfully.
Nov 23 15:41:29 np0005532763 podman[76502]: 2025-11-23 20:41:29.362451234 +0000 UTC m=+6.150429483 container remove 2d2425666546f1a0e4f8d53ad1ff55f8f51d3d09f4a82f20cb0613e20f2f08cb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_dhawan, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 23 15:41:29 np0005532763 systemd[1]: libpod-conmon-2d2425666546f1a0e4f8d53ad1ff55f8f51d3d09f4a82f20cb0613e20f2f08cb.scope: Deactivated successfully.
Nov 23 15:41:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e29 e29: 3 total, 2 up, 3 in
Nov 23 15:41:29 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/330844918' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Nov 23 15:41:29 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 23 15:41:29 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 23 15:41:29 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 23 15:41:29 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 23 15:41:30 np0005532763 podman[77608]: 2025-11-23 20:41:30.110596345 +0000 UTC m=+0.064476422 container create b5d2ca2e9d6541b7b57d1ebf63cbcf5f952f943cd5cf66cee7230c5384f93b97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_chandrasekhar, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid)
Nov 23 15:41:30 np0005532763 systemd[1]: Started libpod-conmon-b5d2ca2e9d6541b7b57d1ebf63cbcf5f952f943cd5cf66cee7230c5384f93b97.scope.
Nov 23 15:41:30 np0005532763 podman[77608]: 2025-11-23 20:41:30.081037704 +0000 UTC m=+0.034917801 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:30 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:41:30 np0005532763 podman[77608]: 2025-11-23 20:41:30.212566088 +0000 UTC m=+0.166446175 container init b5d2ca2e9d6541b7b57d1ebf63cbcf5f952f943cd5cf66cee7230c5384f93b97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_chandrasekhar, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 15:41:30 np0005532763 podman[77608]: 2025-11-23 20:41:30.224941451 +0000 UTC m=+0.178821528 container start b5d2ca2e9d6541b7b57d1ebf63cbcf5f952f943cd5cf66cee7230c5384f93b97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_chandrasekhar, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 15:41:30 np0005532763 podman[77608]: 2025-11-23 20:41:30.229711524 +0000 UTC m=+0.183591601 container attach b5d2ca2e9d6541b7b57d1ebf63cbcf5f952f943cd5cf66cee7230c5384f93b97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_chandrasekhar, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 15:41:30 np0005532763 magical_chandrasekhar[77624]: 167 167
Nov 23 15:41:30 np0005532763 podman[77608]: 2025-11-23 20:41:30.23173265 +0000 UTC m=+0.185612747 container died b5d2ca2e9d6541b7b57d1ebf63cbcf5f952f943cd5cf66cee7230c5384f93b97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_chandrasekhar, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 15:41:30 np0005532763 systemd[1]: libpod-b5d2ca2e9d6541b7b57d1ebf63cbcf5f952f943cd5cf66cee7230c5384f93b97.scope: Deactivated successfully.
Nov 23 15:41:30 np0005532763 systemd[1]: var-lib-containers-storage-overlay-485993b5aa01eb0d3aea7ce306d4906da98a3b87db69f49f833d93acb052d9ee-merged.mount: Deactivated successfully.
Nov 23 15:41:30 np0005532763 podman[77608]: 2025-11-23 20:41:30.283336983 +0000 UTC m=+0.237217060 container remove b5d2ca2e9d6541b7b57d1ebf63cbcf5f952f943cd5cf66cee7230c5384f93b97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_chandrasekhar, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, ceph=True)
Nov 23 15:41:30 np0005532763 systemd[1]: libpod-conmon-b5d2ca2e9d6541b7b57d1ebf63cbcf5f952f943cd5cf66cee7230c5384f93b97.scope: Deactivated successfully.
Nov 23 15:41:30 np0005532763 podman[77647]: 2025-11-23 20:41:30.524367859 +0000 UTC m=+0.072765713 container create 63294ef9d1da4d8ca3d6613c78ce8549ffd09b8a740276bb338def391aaf8965 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_elbakyan, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 23 15:41:30 np0005532763 systemd[1]: Started libpod-conmon-63294ef9d1da4d8ca3d6613c78ce8549ffd09b8a740276bb338def391aaf8965.scope.
Nov 23 15:41:30 np0005532763 podman[77647]: 2025-11-23 20:41:30.497980046 +0000 UTC m=+0.046378000 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:30 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:41:30 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fff222fa2b4da12498d99ea07481a3dcdf7965e76f2d744b36ce44410ccdee8/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:30 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fff222fa2b4da12498d99ea07481a3dcdf7965e76f2d744b36ce44410ccdee8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:30 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fff222fa2b4da12498d99ea07481a3dcdf7965e76f2d744b36ce44410ccdee8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:30 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fff222fa2b4da12498d99ea07481a3dcdf7965e76f2d744b36ce44410ccdee8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:30 np0005532763 podman[77647]: 2025-11-23 20:41:30.623627345 +0000 UTC m=+0.172025309 container init 63294ef9d1da4d8ca3d6613c78ce8549ffd09b8a740276bb338def391aaf8965 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_elbakyan, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1)
Nov 23 15:41:30 np0005532763 podman[77647]: 2025-11-23 20:41:30.638250561 +0000 UTC m=+0.186648465 container start 63294ef9d1da4d8ca3d6613c78ce8549ffd09b8a740276bb338def391aaf8965 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_elbakyan, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 15:41:30 np0005532763 podman[77647]: 2025-11-23 20:41:30.643114026 +0000 UTC m=+0.191511890 container attach 63294ef9d1da4d8ca3d6613c78ce8549ffd09b8a740276bb338def391aaf8965 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_elbakyan, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True)
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]: {
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:    "2": [
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:        {
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:            "devices": [
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:                "/dev/loop3"
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:            ],
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:            "lv_name": "ceph_lv0",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:            "lv_size": "21470642176",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=alMEbq-XUaO-4x17-n2i3-zyAI-GPZG-esJK1S,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=03808be8-ae4a-5548-82e6-4a294f1bc627,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=89316dd3-297e-4d1b-953e-7f2ac7cbe63c,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:            "lv_uuid": "alMEbq-XUaO-4x17-n2i3-zyAI-GPZG-esJK1S",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:            "name": "ceph_lv0",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:            "path": "/dev/ceph_vg0/ceph_lv0",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:            "tags": {
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:                "ceph.block_uuid": "alMEbq-XUaO-4x17-n2i3-zyAI-GPZG-esJK1S",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:                "ceph.cephx_lockbox_secret": "",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:                "ceph.cluster_fsid": "03808be8-ae4a-5548-82e6-4a294f1bc627",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:                "ceph.cluster_name": "ceph",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:                "ceph.crush_device_class": "",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:                "ceph.encrypted": "0",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:                "ceph.osd_fsid": "89316dd3-297e-4d1b-953e-7f2ac7cbe63c",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:                "ceph.osd_id": "2",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:                "ceph.osdspec_affinity": "default_drive_group",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:                "ceph.type": "block",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:                "ceph.vdo": "0",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:                "ceph.with_tpm": "0"
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:            },
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:            "type": "block",
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:            "vg_name": "ceph_vg0"
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:        }
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]:    ]
Nov 23 15:41:30 np0005532763 pensive_elbakyan[77663]: }
Nov 23 15:41:30 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e30 e30: 3 total, 2 up, 3 in
Nov 23 15:41:30 np0005532763 systemd[1]: libpod-63294ef9d1da4d8ca3d6613c78ce8549ffd09b8a740276bb338def391aaf8965.scope: Deactivated successfully.
Nov 23 15:41:30 np0005532763 ceph-mon[75752]: Health check cleared: POOL_APP_NOT_ENABLED (was: 2 pool(s) do not have an application enabled)
Nov 23 15:41:30 np0005532763 ceph-mon[75752]: Cluster is now healthy
Nov 23 15:41:30 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 23 15:41:30 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 23 15:41:30 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 23 15:41:30 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 23 15:41:31 np0005532763 podman[77672]: 2025-11-23 20:41:31.037415089 +0000 UTC m=+0.047170471 container died 63294ef9d1da4d8ca3d6613c78ce8549ffd09b8a740276bb338def391aaf8965 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_elbakyan, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Nov 23 15:41:31 np0005532763 systemd[1]: var-lib-containers-storage-overlay-1fff222fa2b4da12498d99ea07481a3dcdf7965e76f2d744b36ce44410ccdee8-merged.mount: Deactivated successfully.
Nov 23 15:41:31 np0005532763 podman[77672]: 2025-11-23 20:41:31.090413011 +0000 UTC m=+0.100168393 container remove 63294ef9d1da4d8ca3d6613c78ce8549ffd09b8a740276bb338def391aaf8965 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_elbakyan, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 23 15:41:31 np0005532763 systemd[1]: libpod-conmon-63294ef9d1da4d8ca3d6613c78ce8549ffd09b8a740276bb338def391aaf8965.scope: Deactivated successfully.
Nov 23 15:41:31 np0005532763 podman[77778]: 2025-11-23 20:41:31.84313539 +0000 UTC m=+0.060065030 container create 62526c8871124e6e75921c3f5b32d6c18780dc5141740b97a04adcd798d7d481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_bhaskara, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0)
Nov 23 15:41:31 np0005532763 systemd[1]: Started libpod-conmon-62526c8871124e6e75921c3f5b32d6c18780dc5141740b97a04adcd798d7d481.scope.
Nov 23 15:41:31 np0005532763 podman[77778]: 2025-11-23 20:41:31.828195405 +0000 UTC m=+0.045125015 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:31 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:41:31 np0005532763 podman[77778]: 2025-11-23 20:41:31.944071124 +0000 UTC m=+0.161000784 container init 62526c8871124e6e75921c3f5b32d6c18780dc5141740b97a04adcd798d7d481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_bhaskara, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Nov 23 15:41:31 np0005532763 podman[77778]: 2025-11-23 20:41:31.954328509 +0000 UTC m=+0.171258159 container start 62526c8871124e6e75921c3f5b32d6c18780dc5141740b97a04adcd798d7d481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_bhaskara, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 15:41:31 np0005532763 podman[77778]: 2025-11-23 20:41:31.958308509 +0000 UTC m=+0.175238159 container attach 62526c8871124e6e75921c3f5b32d6c18780dc5141740b97a04adcd798d7d481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_bhaskara, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:41:31 np0005532763 nostalgic_bhaskara[77794]: 167 167
Nov 23 15:41:31 np0005532763 systemd[1]: libpod-62526c8871124e6e75921c3f5b32d6c18780dc5141740b97a04adcd798d7d481.scope: Deactivated successfully.
Nov 23 15:41:31 np0005532763 podman[77778]: 2025-11-23 20:41:31.962422413 +0000 UTC m=+0.179352053 container died 62526c8871124e6e75921c3f5b32d6c18780dc5141740b97a04adcd798d7d481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_bhaskara, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 15:41:31 np0005532763 systemd[1]: var-lib-containers-storage-overlay-a94738028b213a625662be32c66ec172ffe0acba96dc19bd27ca2445756e6aa2-merged.mount: Deactivated successfully.
Nov 23 15:41:32 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Nov 23 15:41:32 np0005532763 ceph-mon[75752]: Deploying daemon osd.2 on compute-2
Nov 23 15:41:32 np0005532763 podman[77778]: 2025-11-23 20:41:32.013560844 +0000 UTC m=+0.230490494 container remove 62526c8871124e6e75921c3f5b32d6c18780dc5141740b97a04adcd798d7d481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_bhaskara, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 15:41:32 np0005532763 systemd[1]: libpod-conmon-62526c8871124e6e75921c3f5b32d6c18780dc5141740b97a04adcd798d7d481.scope: Deactivated successfully.
Nov 23 15:41:32 np0005532763 podman[77823]: 2025-11-23 20:41:32.404489213 +0000 UTC m=+0.065031747 container create 71cfcdf048348341893e4650fe87392e0a041ca58f7d4de36e13fe8ad98f8203 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate-test, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Nov 23 15:41:32 np0005532763 systemd[1]: Started libpod-conmon-71cfcdf048348341893e4650fe87392e0a041ca58f7d4de36e13fe8ad98f8203.scope.
Nov 23 15:41:32 np0005532763 podman[77823]: 2025-11-23 20:41:32.375294322 +0000 UTC m=+0.035836896 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:32 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:41:32 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/371a237c78698850c598f0ba01d5dda26b9a60fae29b225bd95155a8b5341047/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:32 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/371a237c78698850c598f0ba01d5dda26b9a60fae29b225bd95155a8b5341047/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:32 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/371a237c78698850c598f0ba01d5dda26b9a60fae29b225bd95155a8b5341047/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:32 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/371a237c78698850c598f0ba01d5dda26b9a60fae29b225bd95155a8b5341047/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:32 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/371a237c78698850c598f0ba01d5dda26b9a60fae29b225bd95155a8b5341047/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:32 np0005532763 podman[77823]: 2025-11-23 20:41:32.510643862 +0000 UTC m=+0.171186446 container init 71cfcdf048348341893e4650fe87392e0a041ca58f7d4de36e13fe8ad98f8203 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate-test, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True)
Nov 23 15:41:32 np0005532763 podman[77823]: 2025-11-23 20:41:32.528286992 +0000 UTC m=+0.188829526 container start 71cfcdf048348341893e4650fe87392e0a041ca58f7d4de36e13fe8ad98f8203 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate-test, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 15:41:32 np0005532763 podman[77823]: 2025-11-23 20:41:32.532958461 +0000 UTC m=+0.193501006 container attach 71cfcdf048348341893e4650fe87392e0a041ca58f7d4de36e13fe8ad98f8203 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate-test, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 15:41:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate-test[77839]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Nov 23 15:41:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate-test[77839]:                            [--no-systemd] [--no-tmpfs]
Nov 23 15:41:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate-test[77839]: ceph-volume activate: error: unrecognized arguments: --bad-option
Nov 23 15:41:32 np0005532763 systemd[1]: libpod-71cfcdf048348341893e4650fe87392e0a041ca58f7d4de36e13fe8ad98f8203.scope: Deactivated successfully.
Nov 23 15:41:32 np0005532763 podman[77823]: 2025-11-23 20:41:32.75575344 +0000 UTC m=+0.416295974 container died 71cfcdf048348341893e4650fe87392e0a041ca58f7d4de36e13fe8ad98f8203 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate-test, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Nov 23 15:41:32 np0005532763 systemd[1]: var-lib-containers-storage-overlay-371a237c78698850c598f0ba01d5dda26b9a60fae29b225bd95155a8b5341047-merged.mount: Deactivated successfully.
Nov 23 15:41:32 np0005532763 podman[77823]: 2025-11-23 20:41:32.800204625 +0000 UTC m=+0.460747149 container remove 71cfcdf048348341893e4650fe87392e0a041ca58f7d4de36e13fe8ad98f8203 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate-test, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Nov 23 15:41:32 np0005532763 systemd[1]: libpod-conmon-71cfcdf048348341893e4650fe87392e0a041ca58f7d4de36e13fe8ad98f8203.scope: Deactivated successfully.
Nov 23 15:41:33 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1120149195' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Nov 23 15:41:33 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1120149195' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Nov 23 15:41:33 np0005532763 systemd[1]: Reloading.
Nov 23 15:41:33 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:41:33 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:41:33 np0005532763 systemd[1]: Reloading.
Nov 23 15:41:33 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:41:33 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:41:33 np0005532763 systemd[1]: Starting Ceph osd.2 for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:41:34 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/475116719' entity='client.admin' 
Nov 23 15:41:34 np0005532763 podman[77999]: 2025-11-23 20:41:34.084424697 +0000 UTC m=+0.067286400 container create 08fa97da32317f1533a2f052ad5cf2ac318832ac71e639f2c2b8eecefe916a8b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Nov 23 15:41:34 np0005532763 podman[77999]: 2025-11-23 20:41:34.05752713 +0000 UTC m=+0.040388853 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:34 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:41:34 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463ffde8c61cabf5b9b2d1d5ad49744504712850b18024efc601dde5541aa942/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:34 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463ffde8c61cabf5b9b2d1d5ad49744504712850b18024efc601dde5541aa942/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:34 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463ffde8c61cabf5b9b2d1d5ad49744504712850b18024efc601dde5541aa942/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:34 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463ffde8c61cabf5b9b2d1d5ad49744504712850b18024efc601dde5541aa942/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:34 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/463ffde8c61cabf5b9b2d1d5ad49744504712850b18024efc601dde5541aa942/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:34 np0005532763 podman[77999]: 2025-11-23 20:41:34.180243429 +0000 UTC m=+0.163105172 container init 08fa97da32317f1533a2f052ad5cf2ac318832ac71e639f2c2b8eecefe916a8b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:41:34 np0005532763 podman[77999]: 2025-11-23 20:41:34.192254272 +0000 UTC m=+0.175115975 container start 08fa97da32317f1533a2f052ad5cf2ac318832ac71e639f2c2b8eecefe916a8b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 23 15:41:34 np0005532763 podman[77999]: 2025-11-23 20:41:34.196399937 +0000 UTC m=+0.179261690 container attach 08fa97da32317f1533a2f052ad5cf2ac318832ac71e639f2c2b8eecefe916a8b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 23 15:41:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e30 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:41:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate[78015]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 23 15:41:34 np0005532763 bash[77999]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 23 15:41:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate[78015]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 23 15:41:34 np0005532763 bash[77999]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 23 15:41:35 np0005532763 lvm[78096]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 15:41:35 np0005532763 lvm[78096]: VG ceph_vg0 finished
Nov 23 15:41:35 np0005532763 ceph-mon[75752]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Nov 23 15:41:35 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:35 np0005532763 ceph-mon[75752]: Saving service ingress.rgw.default spec with placement count:2
Nov 23 15:41:35 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate[78015]: --> Failed to activate via raw: did not find any matching OSD to activate
Nov 23 15:41:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate[78015]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 23 15:41:35 np0005532763 bash[77999]: --> Failed to activate via raw: did not find any matching OSD to activate
Nov 23 15:41:35 np0005532763 bash[77999]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 23 15:41:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate[78015]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 23 15:41:35 np0005532763 bash[77999]: Running command: /usr/bin/ceph-authtool --gen-print-key
Nov 23 15:41:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate[78015]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 23 15:41:35 np0005532763 bash[77999]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 23 15:41:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate[78015]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Nov 23 15:41:35 np0005532763 bash[77999]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Nov 23 15:41:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate[78015]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:35 np0005532763 bash[77999]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate[78015]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:35 np0005532763 bash[77999]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate[78015]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 23 15:41:35 np0005532763 bash[77999]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Nov 23 15:41:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate[78015]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 23 15:41:35 np0005532763 bash[77999]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Nov 23 15:41:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate[78015]: --> ceph-volume lvm activate successful for osd ID: 2
Nov 23 15:41:35 np0005532763 bash[77999]: --> ceph-volume lvm activate successful for osd ID: 2
Nov 23 15:41:35 np0005532763 systemd[1]: libpod-08fa97da32317f1533a2f052ad5cf2ac318832ac71e639f2c2b8eecefe916a8b.scope: Deactivated successfully.
Nov 23 15:41:35 np0005532763 podman[77999]: 2025-11-23 20:41:35.643876294 +0000 UTC m=+1.626738007 container died 08fa97da32317f1533a2f052ad5cf2ac318832ac71e639f2c2b8eecefe916a8b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Nov 23 15:41:35 np0005532763 systemd[1]: libpod-08fa97da32317f1533a2f052ad5cf2ac318832ac71e639f2c2b8eecefe916a8b.scope: Consumed 1.754s CPU time.
Nov 23 15:41:35 np0005532763 systemd[1]: var-lib-containers-storage-overlay-463ffde8c61cabf5b9b2d1d5ad49744504712850b18024efc601dde5541aa942-merged.mount: Deactivated successfully.
Nov 23 15:41:35 np0005532763 podman[77999]: 2025-11-23 20:41:35.707329177 +0000 UTC m=+1.690190860 container remove 08fa97da32317f1533a2f052ad5cf2ac318832ac71e639f2c2b8eecefe916a8b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2-activate, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Nov 23 15:41:36 np0005532763 podman[78249]: 2025-11-23 20:41:36.025063093 +0000 UTC m=+0.110305235 container create 33c6b3e6b30aaacf357d84f2f59189ef3b637b2ca4f9dbf4774178c8307f1a7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Nov 23 15:41:36 np0005532763 podman[78249]: 2025-11-23 20:41:35.958132143 +0000 UTC m=+0.043374335 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:36 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b27264e0eda7a6024786433b61cb0eb1227aecb4ae0ead8feae79ce373de13/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:36 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b27264e0eda7a6024786433b61cb0eb1227aecb4ae0ead8feae79ce373de13/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:36 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b27264e0eda7a6024786433b61cb0eb1227aecb4ae0ead8feae79ce373de13/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:36 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b27264e0eda7a6024786433b61cb0eb1227aecb4ae0ead8feae79ce373de13/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:36 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39b27264e0eda7a6024786433b61cb0eb1227aecb4ae0ead8feae79ce373de13/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:36 np0005532763 podman[78249]: 2025-11-23 20:41:36.123861527 +0000 UTC m=+0.209103719 container init 33c6b3e6b30aaacf357d84f2f59189ef3b637b2ca4f9dbf4774178c8307f1a7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Nov 23 15:41:36 np0005532763 podman[78249]: 2025-11-23 20:41:36.138521484 +0000 UTC m=+0.223763636 container start 33c6b3e6b30aaacf357d84f2f59189ef3b637b2ca4f9dbf4774178c8307f1a7c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:41:36 np0005532763 bash[78249]: 33c6b3e6b30aaacf357d84f2f59189ef3b637b2ca4f9dbf4774178c8307f1a7c
Nov 23 15:41:36 np0005532763 systemd[1]: Started Ceph osd.2 for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: set uid:gid to 167:167 (ceph:ceph)
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: pidfile_write: ignore empty --pid-file
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) close
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) close
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 15:41:36 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) close
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) close
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) close
Nov 23 15:41:37 np0005532763 podman[78379]: 2025-11-23 20:41:37.55784599 +0000 UTC m=+0.067480875 container create a52570f7130302d41914fe98363f19447fabb30b66657a76edb1d27b72154ad9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_bohr, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33800 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33800 /var/lib/ceph/osd/ceph-2/block) close
Nov 23 15:41:37 np0005532763 systemd[1]: Started libpod-conmon-a52570f7130302d41914fe98363f19447fabb30b66657a76edb1d27b72154ad9.scope.
Nov 23 15:41:37 np0005532763 podman[78379]: 2025-11-23 20:41:37.529060331 +0000 UTC m=+0.038695276 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:37 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:41:37 np0005532763 podman[78379]: 2025-11-23 20:41:37.658104375 +0000 UTC m=+0.167739320 container init a52570f7130302d41914fe98363f19447fabb30b66657a76edb1d27b72154ad9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_bohr, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:41:37 np0005532763 podman[78379]: 2025-11-23 20:41:37.668461743 +0000 UTC m=+0.178096658 container start a52570f7130302d41914fe98363f19447fabb30b66657a76edb1d27b72154ad9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_bohr, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Nov 23 15:41:37 np0005532763 podman[78379]: 2025-11-23 20:41:37.673444291 +0000 UTC m=+0.183079256 container attach a52570f7130302d41914fe98363f19447fabb30b66657a76edb1d27b72154ad9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_bohr, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True)
Nov 23 15:41:37 np0005532763 condescending_bohr[78398]: 167 167
Nov 23 15:41:37 np0005532763 systemd[1]: libpod-a52570f7130302d41914fe98363f19447fabb30b66657a76edb1d27b72154ad9.scope: Deactivated successfully.
Nov 23 15:41:37 np0005532763 podman[78379]: 2025-11-23 20:41:37.680333643 +0000 UTC m=+0.189968588 container died a52570f7130302d41914fe98363f19447fabb30b66657a76edb1d27b72154ad9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_bohr, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Nov 23 15:41:37 np0005532763 systemd[1]: var-lib-containers-storage-overlay-c03094d5b07a0a4d7becdbcd57228d797d35ebc4054776a5850d2de75329885e-merged.mount: Deactivated successfully.
Nov 23 15:41:37 np0005532763 podman[78379]: 2025-11-23 20:41:37.731913875 +0000 UTC m=+0.241548790 container remove a52570f7130302d41914fe98363f19447fabb30b66657a76edb1d27b72154ad9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_bohr, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 15:41:37 np0005532763 systemd[1]: libpod-conmon-a52570f7130302d41914fe98363f19447fabb30b66657a76edb1d27b72154ad9.scope: Deactivated successfully.
Nov 23 15:41:37 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:37 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:37 np0005532763 ceph-mon[75752]: Saving service node-exporter spec with placement *
Nov 23 15:41:37 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:37 np0005532763 ceph-mon[75752]: Saving service grafana spec with placement compute-0;count:1
Nov 23 15:41:37 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:37 np0005532763 ceph-mon[75752]: Saving service prometheus spec with placement compute-0;count:1
Nov 23 15:41:37 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:37 np0005532763 ceph-mon[75752]: Saving service alertmanager spec with placement compute-0;count:1
Nov 23 15:41:37 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:37 np0005532763 ceph-osd[78269]: bdev(0x557d78b33c00 /var/lib/ceph/osd/ceph-2/block) close
Nov 23 15:41:37 np0005532763 podman[78421]: 2025-11-23 20:41:37.962762867 +0000 UTC m=+0.057936111 container create 88f248a4982281ffb76ace3f1eb884a63dd28b684466fdec5621c25e6e00ad8e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_diffie, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 15:41:38 np0005532763 systemd[1]: Started libpod-conmon-88f248a4982281ffb76ace3f1eb884a63dd28b684466fdec5621c25e6e00ad8e.scope.
Nov 23 15:41:38 np0005532763 podman[78421]: 2025-11-23 20:41:37.935805378 +0000 UTC m=+0.030978642 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:38 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:41:38 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58fb5e74a2fff9f660b7ac1b093895fd7cdfd3b5702321225d84ca9ef66c3a7b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:38 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58fb5e74a2fff9f660b7ac1b093895fd7cdfd3b5702321225d84ca9ef66c3a7b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:38 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58fb5e74a2fff9f660b7ac1b093895fd7cdfd3b5702321225d84ca9ef66c3a7b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:38 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58fb5e74a2fff9f660b7ac1b093895fd7cdfd3b5702321225d84ca9ef66c3a7b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:38 np0005532763 podman[78421]: 2025-11-23 20:41:38.095871724 +0000 UTC m=+0.191044978 container init 88f248a4982281ffb76ace3f1eb884a63dd28b684466fdec5621c25e6e00ad8e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_diffie, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 23 15:41:38 np0005532763 podman[78421]: 2025-11-23 20:41:38.107175428 +0000 UTC m=+0.202348652 container start 88f248a4982281ffb76ace3f1eb884a63dd28b684466fdec5621c25e6e00ad8e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_diffie, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: load: jerasure load: lrc 
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) close
Nov 23 15:41:38 np0005532763 podman[78421]: 2025-11-23 20:41:38.222164582 +0000 UTC m=+0.317337906 container attach 88f248a4982281ffb76ace3f1eb884a63dd28b684466fdec5621c25e6e00ad8e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_diffie, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) close
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) close
Nov 23 15:41:38 np0005532763 lvm[78529]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 15:41:38 np0005532763 lvm[78529]: VG ceph_vg0 finished
Nov 23 15:41:38 np0005532763 zen_diffie[78437]: {}
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) close
Nov 23 15:41:38 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/209025710' entity='client.admin' 
Nov 23 15:41:38 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/3810457862' entity='client.admin' 
Nov 23 15:41:38 np0005532763 podman[78421]: 2025-11-23 20:41:38.969624185 +0000 UTC m=+1.064797429 container died 88f248a4982281ffb76ace3f1eb884a63dd28b684466fdec5621c25e6e00ad8e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_diffie, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Nov 23 15:41:38 np0005532763 systemd[1]: libpod-88f248a4982281ffb76ace3f1eb884a63dd28b684466fdec5621c25e6e00ad8e.scope: Deactivated successfully.
Nov 23 15:41:38 np0005532763 systemd[1]: libpod-88f248a4982281ffb76ace3f1eb884a63dd28b684466fdec5621c25e6e00ad8e.scope: Consumed 1.401s CPU time.
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 15:41:38 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) close
Nov 23 15:41:39 np0005532763 systemd[1]: var-lib-containers-storage-overlay-58fb5e74a2fff9f660b7ac1b093895fd7cdfd3b5702321225d84ca9ef66c3a7b-merged.mount: Deactivated successfully.
Nov 23 15:41:39 np0005532763 podman[78421]: 2025-11-23 20:41:39.033412187 +0000 UTC m=+1.128585401 container remove 88f248a4982281ffb76ace3f1eb884a63dd28b684466fdec5621c25e6e00ad8e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_diffie, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 23 15:41:39 np0005532763 systemd[1]: libpod-conmon-88f248a4982281ffb76ace3f1eb884a63dd28b684466fdec5621c25e6e00ad8e.scope: Deactivated successfully.
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bdev(0x557d799aac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bdev(0x557d799ab000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bdev(0x557d799ab000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bdev(0x557d799ab000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bdev(0x557d799ab000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs mount
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs mount shared_bdev_used = 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: RocksDB version: 7.9.2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Git sha 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Compile date 2025-07-17 03:12:14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: DB SUMMARY
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: DB Session ID:  R1RR8GMPOAUXX4OTDRYT
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: CURRENT file:  CURRENT
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: IDENTITY file:  IDENTITY
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                         Options.error_if_exists: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.create_if_missing: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                         Options.paranoid_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                                     Options.env: 0x557d78b86770
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                                Options.info_log: 0x557d799af9c0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_file_opening_threads: 16
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                              Options.statistics: (nil)
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.use_fsync: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.max_log_file_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                         Options.allow_fallocate: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.use_direct_reads: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.create_missing_column_families: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                              Options.db_log_dir: 
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                                 Options.wal_dir: db.wal
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.advise_random_on_open: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.write_buffer_manager: 0x557d79a92a00
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                            Options.rate_limiter: (nil)
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.unordered_write: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.row_cache: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                              Options.wal_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.allow_ingest_behind: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.two_write_queues: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.manual_wal_flush: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.wal_compression: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.atomic_flush: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.log_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.allow_data_in_errors: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.db_host_id: __hostname__
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.max_background_jobs: 4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.max_background_compactions: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.max_subcompactions: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.max_open_files: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.bytes_per_sync: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.max_background_flushes: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Compression algorithms supported:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: 	kZSTD supported: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: 	kXpressCompression supported: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: 	kBZip2Compression supported: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: 	kLZ4Compression supported: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: 	kZlibCompression supported: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: 	kLZ4HCCompression supported: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: 	kSnappyCompression supported: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799afd80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557d78bc9350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799afd80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557d78bc9350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799afd80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557d78bc9350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799afd80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557d78bc9350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799afd80)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557d78bc9350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799afd80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x557d78bc9350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799afd80)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x557d78bc9350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799afda0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x557d78bc89b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799afda0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x557d78bc89b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799afda0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x557d78bc89b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: ad9bb5d9-48d9-44b5-992c-51f2bb4e5485
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930499282367, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930499282776, "job": 1, "event": "recovery_finished"}
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: freelist init
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: freelist _read_cfg
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs umount
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bdev(0x557d799ab000 /var/lib/ceph/osd/ceph-2/block) close
Nov 23 15:41:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e30 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bdev(0x557d799ab000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bdev(0x557d799ab000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bdev(0x557d799ab000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bdev(0x557d799ab000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs mount
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluefs mount shared_bdev_used = 4718592
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: RocksDB version: 7.9.2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Git sha 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Compile date 2025-07-17 03:12:14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: DB SUMMARY
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: DB Session ID:  R1RR8GMPOAUXX4OTDRYS
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: CURRENT file:  CURRENT
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: IDENTITY file:  IDENTITY
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                         Options.error_if_exists: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.create_if_missing: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                         Options.paranoid_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.flush_verify_memtable_count: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                                     Options.env: 0x557d78b86d90
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                                      Options.fs: LegacyFileSystem
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                                Options.info_log: 0x557d799afb40
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_file_opening_threads: 16
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                              Options.statistics: (nil)
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.use_fsync: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.max_log_file_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.log_file_time_to_roll: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.keep_log_file_num: 1000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.recycle_log_file_num: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                         Options.allow_fallocate: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.allow_mmap_reads: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.allow_mmap_writes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.use_direct_reads: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.create_missing_column_families: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                              Options.db_log_dir: 
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                                 Options.wal_dir: db.wal
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.table_cache_numshardbits: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                         Options.WAL_ttl_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.WAL_size_limit_MB: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.manifest_preallocation_size: 4194304
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                     Options.is_fd_close_on_exec: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.advise_random_on_open: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.db_write_buffer_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.write_buffer_manager: 0x557d79a92a00
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.access_hint_on_compaction_start: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                      Options.use_adaptive_mutex: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                            Options.rate_limiter: (nil)
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.wal_recovery_mode: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.enable_thread_tracking: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.enable_pipelined_write: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.unordered_write: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.write_thread_max_yield_usec: 100
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.row_cache: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                              Options.wal_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.avoid_flush_during_recovery: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.allow_ingest_behind: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.two_write_queues: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.manual_wal_flush: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.wal_compression: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.atomic_flush: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.persist_stats_to_disk: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.write_dbid_to_manifest: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.log_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.best_efforts_recovery: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.allow_data_in_errors: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.db_host_id: __hostname__
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.enforce_single_del_contracts: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.max_background_jobs: 4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.max_background_compactions: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.max_subcompactions: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.writable_file_max_buffer_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.delayed_write_rate : 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.max_total_wal_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.stats_dump_period_sec: 600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.stats_persist_period_sec: 600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.max_open_files: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.bytes_per_sync: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                      Options.wal_bytes_per_sync: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.strict_bytes_per_sync: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.compaction_readahead_size: 2097152
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.max_background_flushes: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Compression algorithms supported:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: #011kZSTD supported: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: #011kXpressCompression supported: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: #011kBZip2Compression supported: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: #011kLZ4Compression supported: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: #011kZlibCompression supported: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: #011kLZ4HCCompression supported: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: #011kSnappyCompression supported: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Fast CRC32 supported: Supported on x86
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: DMutex implementation: pthread_mutex_t
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799af8a0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x557d78bc9350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799af8a0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x557d78bc9350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799af8a0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557d78bc9350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799af8a0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557d78bc9350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799af8a0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557d78bc9350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799af8a0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557d78bc9350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799af8a0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557d78bc9350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799afce0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557d78bc89b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799afce0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557d78bc89b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:           Options.merge_operator: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.compaction_filter_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.sst_partitioner_factory: None
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.memtable_factory: SkipListFactory
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.table_factory: BlockBasedTable
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557d799afce0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x557d78bc89b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.write_buffer_size: 16777216
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.max_write_buffer_number: 64
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.compression: LZ4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression: Disabled
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.num_levels: 7
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:            Options.compression_opts.window_bits: -14
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.level: 32767
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.compression_opts.strategy: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.parallel_threads: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                  Options.compression_opts.enabled: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:              Options.level0_stop_writes_trigger: 36
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.target_file_size_base: 67108864
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:             Options.target_file_size_multiplier: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.arena_block_size: 1048576
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.disable_auto_compactions: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.inplace_update_support: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                 Options.inplace_update_num_locks: 10000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:               Options.memtable_whole_key_filtering: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:   Options.memtable_huge_page_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.bloom_locality: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                    Options.max_successive_merges: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.optimize_filters_for_hits: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.paranoid_file_checks: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.force_consistency_checks: 1
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.report_bg_io_stats: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                               Options.ttl: 2592000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.periodic_compaction_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:    Options.preserve_internal_time_seconds: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                       Options.enable_blob_files: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                           Options.min_blob_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                          Options.blob_file_size: 268435456
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                   Options.blob_compression_type: NoCompression
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.enable_blob_garbage_collection: false
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:          Options.blob_compaction_readahead_size: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb:                Options.blob_file_starting_level: 0
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: ad9bb5d9-48d9-44b5-992c-51f2bb4e5485
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930499562190, "job": 1, "event": "recovery_started", "wal_files": [31]}
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930499566069, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930499, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ad9bb5d9-48d9-44b5-992c-51f2bb4e5485", "db_session_id": "R1RR8GMPOAUXX4OTDRYS", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930499568438, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 469, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930499, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ad9bb5d9-48d9-44b5-992c-51f2bb4e5485", "db_session_id": "R1RR8GMPOAUXX4OTDRYS", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930499570872, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930499, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ad9bb5d9-48d9-44b5-992c-51f2bb4e5485", "db_session_id": "R1RR8GMPOAUXX4OTDRYS", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930499572283, "job": 1, "event": "recovery_finished"}
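The `EVENT_LOG_v1` records above (recovery_started, table_file_creation, recovery_finished) carry a machine-readable JSON payload after the marker. A minimal sketch for pulling that payload out of a journal line — the helper name `parse_event_log` is ours, not part of RocksDB or Ceph:

```python
import json

def parse_event_log(line: str):
    """Return the JSON payload of a RocksDB EVENT_LOG_v1 line, or None."""
    marker = "EVENT_LOG_v1 "
    idx = line.find(marker)
    if idx == -1:
        return None  # not an event-log line
    return json.loads(line[idx + len(marker):])

# Sample taken verbatim from the log above.
sample = ('rocksdb: EVENT_LOG_v1 {"time_micros": 1763930499572283, '
          '"job": 1, "event": "recovery_finished"}')
event = parse_event_log(sample)
print(event["event"])  # recovery_finished
```

Filtering a whole journal for `"event": "table_file_creation"` this way is a quick way to track SST file creation (file_number, file_size, column family) during WAL recovery.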
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x557d79b54000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: DB pointer 0x557d79ce0000
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
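The `_open_db` line above logs the effective RocksDB tuning as one comma-separated `key=value` string. A small sketch for splitting it into a dict for comparison against `bluestore_rocksdb_options` — the function name is ours, and values stay as strings (note `compaction_readahead_size=2MB` is not a plain integer):

```python
def parse_option_string(opts: str) -> dict:
    """Split a 'k1=v1,k2=v2,...' RocksDB option string into a dict of strings."""
    return dict(kv.split("=", 1) for kv in opts.split(",") if "=" in kv)

# Excerpt of the option string from the log line above.
opts = parse_option_string(
    "compression=kLZ4Compression,max_write_buffer_number=64,"
    "min_write_buffer_number_to_merge=6,write_buffer_size=16777216,"
    "compaction_readahead_size=2MB")
print(opts["max_write_buffer_number"])  # 64
```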
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557d78bc9350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557d78bc9350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012
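The stats dump above is a single journal record in which syslog has escaped embedded control characters as `#` plus three octal digits (`#011` = tab, `#012` = newline). A small sketch that expands those escapes so the multi-line "DUMPING STATS" tables become readable — the function name is ours:

```python
import re

def decode_octal_escapes(text: str) -> str:
    """Expand syslog-style '#NNN' octal escapes (e.g. #011 tab, #012 newline)."""
    return re.sub(r"#([0-3][0-7]{2})",
                  lambda m: chr(int(m.group(1), 8)),
                  text)

# Excerpt of the escaped record above, rendered on real lines.
print(decode_octal_escapes("** DB Stats **#012Uptime(secs): 0.1 total"))
```

Note the `#2` in `BinnedLRUCache@0x557d78bc9350#2` is a literal shard suffix, not an escape; requiring exactly three octal digits after `#` leaves it untouched.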
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: _get_class not permitted to load lua
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: _get_class not permitted to load sdk
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: osd.2 0 load_pgs
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: osd.2 0 load_pgs opened 0 pgs
Nov 23 15:41:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2[78265]: 2025-11-23T20:41:39.605+0000 7f4b7ef91740 -1 osd.2 0 log_to_monitors true
Nov 23 15:41:39 np0005532763 ceph-osd[78269]: osd.2 0 log_to_monitors true
Nov 23 15:41:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0)
Nov 23 15:41:39 np0005532763 ceph-mon[75752]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/530987644,v1:192.168.122.102:6801/530987644]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Nov 23 15:41:39 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:39 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:39 np0005532763 ceph-mon[75752]: from='osd.2 [v2:192.168.122.102:6800/530987644,v1:192.168.122.102:6801/530987644]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Nov 23 15:41:39 np0005532763 ceph-mon[75752]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Nov 23 15:41:39 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1043924838' entity='client.admin' 
Nov 23 15:41:40 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e31 e31: 3 total, 2 up, 3 in
Nov 23 15:41:40 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]} v 0)
Nov 23 15:41:40 np0005532763 ceph-mon[75752]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/530987644,v1:192.168.122.102:6801/530987644]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Nov 23 15:41:40 np0005532763 podman[79096]: 2025-11-23 20:41:40.242747869 +0000 UTC m=+0.091894263 container exec 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Nov 23 15:41:40 np0005532763 podman[79096]: 2025-11-23 20:41:40.371550267 +0000 UTC m=+0.220696611 container exec_died 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Nov 23 15:41:40 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Nov 23 15:41:40 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Nov 23 15:41:40 np0005532763 ceph-mon[75752]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Nov 23 15:41:40 np0005532763 ceph-mon[75752]: from='osd.2 [v2:192.168.122.102:6800/530987644,v1:192.168.122.102:6801/530987644]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Nov 23 15:41:40 np0005532763 ceph-mon[75752]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Nov 23 15:41:40 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:40 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:40 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:40 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:41 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e32 e32: 3 total, 2 up, 3 in
Nov 23 15:41:41 np0005532763 ceph-osd[78269]: osd.2 0 done with init, starting boot process
Nov 23 15:41:41 np0005532763 ceph-osd[78269]: osd.2 0 start_boot
Nov 23 15:41:41 np0005532763 ceph-osd[78269]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Nov 23 15:41:41 np0005532763 ceph-osd[78269]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Nov 23 15:41:41 np0005532763 ceph-osd[78269]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Nov 23 15:41:41 np0005532763 ceph-osd[78269]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Nov 23 15:41:41 np0005532763 ceph-osd[78269]: osd.2 0  bench count 12288000 bsize 4 KiB
Nov 23 15:41:42 np0005532763 podman[79354]: 2025-11-23 20:41:42.289198874 +0000 UTC m=+0.031527367 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:42 np0005532763 ceph-mon[75752]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
Nov 23 15:41:42 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1122996363' entity='client.admin' 
Nov 23 15:41:42 np0005532763 podman[79354]: 2025-11-23 20:41:42.605802239 +0000 UTC m=+0.348130722 container create a240ca06ed6aa4c22ff626c875447bdb3ed67060bae6e135828a99127c53462a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=optimistic_grothendieck, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Nov 23 15:41:42 np0005532763 systemd[1]: Started libpod-conmon-a240ca06ed6aa4c22ff626c875447bdb3ed67060bae6e135828a99127c53462a.scope.
Nov 23 15:41:42 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:41:42 np0005532763 podman[79354]: 2025-11-23 20:41:42.730168713 +0000 UTC m=+0.472497226 container init a240ca06ed6aa4c22ff626c875447bdb3ed67060bae6e135828a99127c53462a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=optimistic_grothendieck, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 15:41:42 np0005532763 podman[79354]: 2025-11-23 20:41:42.740755437 +0000 UTC m=+0.483083920 container start a240ca06ed6aa4c22ff626c875447bdb3ed67060bae6e135828a99127c53462a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=optimistic_grothendieck, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 15:41:42 np0005532763 podman[79354]: 2025-11-23 20:41:42.74591292 +0000 UTC m=+0.488241403 container attach a240ca06ed6aa4c22ff626c875447bdb3ed67060bae6e135828a99127c53462a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=optimistic_grothendieck, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 15:41:42 np0005532763 optimistic_grothendieck[79370]: 167 167
Nov 23 15:41:42 np0005532763 systemd[1]: libpod-a240ca06ed6aa4c22ff626c875447bdb3ed67060bae6e135828a99127c53462a.scope: Deactivated successfully.
Nov 23 15:41:42 np0005532763 podman[79354]: 2025-11-23 20:41:42.748868812 +0000 UTC m=+0.491197295 container died a240ca06ed6aa4c22ff626c875447bdb3ed67060bae6e135828a99127c53462a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=optimistic_grothendieck, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 23 15:41:42 np0005532763 systemd[1]: var-lib-containers-storage-overlay-0c840554526b060294df85cc7d432f89cf46417ce2faef9ad54e1f5d652a4be8-merged.mount: Deactivated successfully.
Nov 23 15:41:42 np0005532763 podman[79354]: 2025-11-23 20:41:42.835389606 +0000 UTC m=+0.577718089 container remove a240ca06ed6aa4c22ff626c875447bdb3ed67060bae6e135828a99127c53462a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=optimistic_grothendieck, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Nov 23 15:41:42 np0005532763 systemd[1]: libpod-conmon-a240ca06ed6aa4c22ff626c875447bdb3ed67060bae6e135828a99127c53462a.scope: Deactivated successfully.
Nov 23 15:41:43 np0005532763 podman[79393]: 2025-11-23 20:41:43.081896843 +0000 UTC m=+0.072044732 container create 9044dc9b77c19602c9015b583fdef4be094fe8d2bf53d77963f634706d7ba32f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_jackson, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True)
Nov 23 15:41:43 np0005532763 podman[79393]: 2025-11-23 20:41:43.054985316 +0000 UTC m=+0.045133215 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:41:43 np0005532763 systemd[1]: Started libpod-conmon-9044dc9b77c19602c9015b583fdef4be094fe8d2bf53d77963f634706d7ba32f.scope.
Nov 23 15:41:43 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:41:43 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ba986a57e82bd3b8b3cfe72c265bb0b626cfb9e6d18e7f2c9ddb12528ff6663/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:43 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ba986a57e82bd3b8b3cfe72c265bb0b626cfb9e6d18e7f2c9ddb12528ff6663/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:43 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ba986a57e82bd3b8b3cfe72c265bb0b626cfb9e6d18e7f2c9ddb12528ff6663/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:43 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ba986a57e82bd3b8b3cfe72c265bb0b626cfb9e6d18e7f2c9ddb12528ff6663/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 15:41:43 np0005532763 podman[79393]: 2025-11-23 20:41:43.323653249 +0000 UTC m=+0.313801198 container init 9044dc9b77c19602c9015b583fdef4be094fe8d2bf53d77963f634706d7ba32f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_jackson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 23 15:41:43 np0005532763 podman[79393]: 2025-11-23 20:41:43.338389338 +0000 UTC m=+0.328537227 container start 9044dc9b77c19602c9015b583fdef4be094fe8d2bf53d77963f634706d7ba32f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_jackson, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 23 15:41:43 np0005532763 podman[79393]: 2025-11-23 20:41:43.342644696 +0000 UTC m=+0.332792645 container attach 9044dc9b77c19602c9015b583fdef4be094fe8d2bf53d77963f634706d7ba32f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_jackson, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default)
Nov 23 15:41:43 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1003409241' entity='client.admin' 
Nov 23 15:41:43 np0005532763 python3[79438]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc#012 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:41:44 np0005532763 funny_jackson[79434]: [
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:    {
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:        "available": false,
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:        "being_replaced": false,
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:        "ceph_device_lvm": false,
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:        "device_id": "QEMU_DVD-ROM_QM00001",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:        "lsm_data": {},
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:        "lvs": [],
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:        "path": "/dev/sr0",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:        "rejected_reasons": [
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "Insufficient space (<5GB)",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "Has a FileSystem"
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:        ],
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:        "sys_api": {
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "actuators": null,
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "device_nodes": [
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:                "sr0"
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            ],
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "devname": "sr0",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "human_readable_size": "482.00 KB",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "id_bus": "ata",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "model": "QEMU DVD-ROM",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "nr_requests": "2",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "parent": "/dev/sr0",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "partitions": {},
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "path": "/dev/sr0",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "removable": "1",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "rev": "2.5+",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "ro": "0",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "rotational": "1",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "sas_address": "",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "sas_device_handle": "",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "scheduler_mode": "mq-deadline",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "sectors": 0,
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "sectorsize": "2048",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "size": 493568.0,
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "support_discard": "2048",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "type": "disk",
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:            "vendor": "QEMU"
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:        }
Nov 23 15:41:44 np0005532763 funny_jackson[79434]:    }
Nov 23 15:41:44 np0005532763 funny_jackson[79434]: ]
Nov 23 15:41:44 np0005532763 systemd[1]: libpod-9044dc9b77c19602c9015b583fdef4be094fe8d2bf53d77963f634706d7ba32f.scope: Deactivated successfully.
Nov 23 15:41:44 np0005532763 podman[79393]: 2025-11-23 20:41:44.275149969 +0000 UTC m=+1.265297818 container died 9044dc9b77c19602c9015b583fdef4be094fe8d2bf53d77963f634706d7ba32f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_jackson, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 15:41:44 np0005532763 systemd[1]: var-lib-containers-storage-overlay-4ba986a57e82bd3b8b3cfe72c265bb0b626cfb9e6d18e7f2c9ddb12528ff6663-merged.mount: Deactivated successfully.
Nov 23 15:41:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e32 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:41:44 np0005532763 podman[79393]: 2025-11-23 20:41:44.35221961 +0000 UTC m=+1.342367499 container remove 9044dc9b77c19602c9015b583fdef4be094fe8d2bf53d77963f634706d7ba32f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=funny_jackson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 23 15:41:44 np0005532763 systemd[1]: libpod-conmon-9044dc9b77c19602c9015b583fdef4be094fe8d2bf53d77963f634706d7ba32f.scope: Deactivated successfully.
Nov 23 15:41:44 np0005532763 ceph-osd[78269]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 38.813 iops: 9936.101 elapsed_sec: 0.302
Nov 23 15:41:44 np0005532763 ceph-osd[78269]: log_channel(cluster) log [WRN] : OSD bench result of 9936.100737 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 23 15:41:44 np0005532763 ceph-osd[78269]: osd.2 0 waiting for initial osdmap
Nov 23 15:41:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2[78265]: 2025-11-23T20:41:44.696+0000 7f4b7af14640 -1 osd.2 0 waiting for initial osdmap
Nov 23 15:41:44 np0005532763 ceph-osd[78269]: osd.2 32 crush map has features 288514051259236352, adjusting msgr requires for clients
Nov 23 15:41:44 np0005532763 ceph-osd[78269]: osd.2 32 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Nov 23 15:41:44 np0005532763 ceph-osd[78269]: osd.2 32 crush map has features 3314933000852226048, adjusting msgr requires for osds
Nov 23 15:41:44 np0005532763 ceph-osd[78269]: osd.2 32 check_osdmap_features require_osd_release unknown -> squid
Nov 23 15:41:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-osd-2[78265]: 2025-11-23T20:41:44.725+0000 7f4b7653c640 -1 osd.2 32 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 23 15:41:44 np0005532763 ceph-osd[78269]: osd.2 32 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Nov 23 15:41:44 np0005532763 ceph-osd[78269]: osd.2 32 set_numa_affinity not setting numa affinity
Nov 23 15:41:44 np0005532763 ceph-osd[78269]: osd.2 32 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Nov 23 15:41:45 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:45 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1887137413' entity='client.admin' 
Nov 23 15:41:45 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:45 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Nov 23 15:41:45 np0005532763 ceph-mon[75752]: Adjusting osd_memory_target on compute-2 to 128.0M
Nov 23 15:41:45 np0005532763 ceph-mon[75752]: Unable to set osd_memory_target on compute-2 to 134217728: error parsing value: Value '134217728' is below minimum 939524096
Nov 23 15:41:45 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:41:45 np0005532763 ceph-mon[75752]: Updating compute-0:/etc/ceph/ceph.conf
Nov 23 15:41:45 np0005532763 ceph-mon[75752]: Updating compute-1:/etc/ceph/ceph.conf
Nov 23 15:41:45 np0005532763 ceph-mon[75752]: Updating compute-2:/etc/ceph/ceph.conf
Nov 23 15:41:45 np0005532763 ceph-mon[75752]: Updating compute-1:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.conf
Nov 23 15:41:45 np0005532763 ceph-mon[75752]: Updating compute-0:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.conf
Nov 23 15:41:45 np0005532763 ceph-mon[75752]: Updating compute-2:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.conf
Nov 23 15:41:45 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1515026058' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Nov 23 15:41:45 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e33 e33: 3 total, 3 up, 3 in
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 33 state: booting -> active
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[2.18( empty local-lis/les=0/0 n=0 ec=20/12 lis/c=20/20 les/c/f=21/21/0 sis=33) [2] r=0 lpr=33 pi=[20,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[4.15( empty local-lis/les=0/0 n=0 ec=22/14 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[2.12( empty local-lis/les=0/0 n=0 ec=20/12 lis/c=20/20 les/c/f=21/21/0 sis=33) [2] r=0 lpr=33 pi=[20,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[3.15( empty local-lis/les=0/0 n=0 ec=22/13 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[3.11( empty local-lis/les=0/0 n=0 ec=22/13 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[3.e( empty local-lis/les=0/0 n=0 ec=22/13 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[4.8( empty local-lis/les=0/0 n=0 ec=22/14 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[4.1f( empty local-lis/les=0/0 n=0 ec=22/14 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[5.4( empty local-lis/les=0/0 n=0 ec=24/16 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[2.b( empty local-lis/les=0/0 n=0 ec=20/12 lis/c=20/20 les/c/f=21/21/0 sis=33) [2] r=0 lpr=33 pi=[20,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[4.9( empty local-lis/les=0/0 n=0 ec=22/14 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[2.f( empty local-lis/les=0/0 n=0 ec=20/12 lis/c=20/20 les/c/f=21/21/0 sis=33) [2] r=0 lpr=33 pi=[20,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[2.5( empty local-lis/les=0/0 n=0 ec=20/12 lis/c=20/20 les/c/f=21/21/0 sis=33) [2] r=0 lpr=33 pi=[20,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[3.9( empty local-lis/les=0/0 n=0 ec=22/13 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[4.1( empty local-lis/les=0/0 n=0 ec=22/14 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[3.1d( empty local-lis/les=0/0 n=0 ec=22/13 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[2.1d( empty local-lis/les=0/0 n=0 ec=20/12 lis/c=20/20 les/c/f=21/21/0 sis=33) [2] r=0 lpr=33 pi=[20,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[5.e( empty local-lis/les=0/0 n=0 ec=24/16 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[5.1a( empty local-lis/les=0/0 n=0 ec=24/16 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[3.1a( empty local-lis/les=0/0 n=0 ec=22/13 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:45 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 33 pg[2.1c( empty local-lis/les=0/0 n=0 ec=20/12 lis/c=20/20 les/c/f=21/21/0 sis=33) [2] r=0 lpr=33 pi=[20,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-mon[75752]: OSD bench result of 9936.100737 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Nov 23 15:41:46 np0005532763 ceph-mon[75752]: osd.2 [v2:192.168.122.102:6800/530987644,v1:192.168.122.102:6801/530987644] boot
Nov 23 15:41:46 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1515026058' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Nov 23 15:41:46 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:46 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:46 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:46 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:46 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:46 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:46 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:46 np0005532763 ceph-mon[75752]: from='mgr.14122 192.168.122.100:0/2507473718' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:41:46 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1621977935' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Nov 23 15:41:46 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e34 e34: 3 total, 3 up, 3 in
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.1f( empty local-lis/les=33/34 n=0 ec=22/14 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[3.15( empty local-lis/les=33/34 n=0 ec=22/13 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.b( empty local-lis/les=33/34 n=0 ec=20/12 lis/c=20/20 les/c/f=21/21/0 sis=33) [2] r=0 lpr=33 pi=[20,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[3.11( empty local-lis/les=33/34 n=0 ec=22/13 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.8( empty local-lis/les=33/34 n=0 ec=22/14 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.9( empty local-lis/les=33/34 n=0 ec=22/14 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.12( empty local-lis/les=33/34 n=0 ec=20/12 lis/c=20/20 les/c/f=21/21/0 sis=33) [2] r=0 lpr=33 pi=[20,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.18( empty local-lis/les=33/34 n=0 ec=20/12 lis/c=20/20 les/c/f=21/21/0 sis=33) [2] r=0 lpr=33 pi=[20,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.f( empty local-lis/les=33/34 n=0 ec=20/12 lis/c=20/20 les/c/f=21/21/0 sis=33) [2] r=0 lpr=33 pi=[20,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[3.e( empty local-lis/les=33/34 n=0 ec=22/13 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[5.4( empty local-lis/les=33/34 n=0 ec=24/16 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[3.1a( empty local-lis/les=33/34 n=0 ec=22/13 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[5.e( empty local-lis/les=33/34 n=0 ec=24/16 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[5.1a( empty local-lis/les=33/34 n=0 ec=24/16 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.15( empty local-lis/les=33/34 n=0 ec=22/14 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.1( empty local-lis/les=33/34 n=0 ec=22/14 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[3.9( empty local-lis/les=33/34 n=0 ec=22/13 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.1d( empty local-lis/les=33/34 n=0 ec=20/12 lis/c=20/20 les/c/f=21/21/0 sis=33) [2] r=0 lpr=33 pi=[20,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.5( empty local-lis/les=33/34 n=0 ec=20/12 lis/c=20/20 les/c/f=21/21/0 sis=33) [2] r=0 lpr=33 pi=[20,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[3.1d( empty local-lis/les=33/34 n=0 ec=22/13 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.1c( empty local-lis/les=33/34 n=0 ec=20/12 lis/c=20/20 les/c/f=21/21/0 sis=33) [2] r=0 lpr=33 pi=[20,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.15( empty local-lis/les=0/0 n=0 ec=20/12 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=34 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[5.12( empty local-lis/les=0/0 n=0 ec=24/16 lis/c=24/24 les/c/f=26/26/0 sis=33) [2] r=0 lpr=34 pi=[24,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[5.13( empty local-lis/les=0/0 n=0 ec=24/16 lis/c=24/24 les/c/f=26/26/0 sis=33) [2] r=0 lpr=34 pi=[24,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.14( empty local-lis/les=0/0 n=0 ec=22/14 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[5.8( empty local-lis/les=0/0 n=0 ec=24/16 lis/c=24/24 les/c/f=26/26/0 sis=33) [2] r=0 lpr=34 pi=[24,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.c( empty local-lis/les=0/0 n=0 ec=20/12 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=34 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.10( empty local-lis/les=0/0 n=0 ec=20/12 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=34 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.13( empty local-lis/les=0/0 n=0 ec=20/12 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=34 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.d( empty local-lis/les=0/0 n=0 ec=20/12 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=34 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[5.b( empty local-lis/les=0/0 n=0 ec=24/16 lis/c=24/24 les/c/f=26/26/0 sis=33) [2] r=0 lpr=34 pi=[24,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.a( empty local-lis/les=0/0 n=0 ec=20/12 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=34 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[5.d( empty local-lis/les=0/0 n=0 ec=24/16 lis/c=24/24 les/c/f=26/26/0 sis=33) [2] r=0 lpr=34 pi=[24,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[3.0( empty local-lis/les=0/0 n=0 ec=13/13 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[5.0( empty local-lis/les=0/0 n=0 ec=16/16 lis/c=24/24 les/c/f=26/26/0 sis=33) [2] r=0 lpr=34 pi=[24,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.2( empty local-lis/les=0/0 n=0 ec=22/14 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.6( empty local-lis/les=0/0 n=0 ec=22/14 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.3( empty local-lis/les=0/0 n=0 ec=22/14 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[3.8( empty local-lis/les=0/0 n=0 ec=22/13 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.1b( empty local-lis/les=0/0 n=0 ec=20/12 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=34 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.1c( empty local-lis/les=0/0 n=0 ec=22/14 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.19( empty local-lis/les=0/0 n=0 ec=22/14 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[3.1b( empty local-lis/les=0/0 n=0 ec=22/13 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.1d( empty local-lis/les=0/0 n=0 ec=22/14 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.15( empty local-lis/les=33/34 n=0 ec=20/12 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=34 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.13( empty local-lis/les=33/34 n=0 ec=20/12 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=34 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[5.8( empty local-lis/les=33/34 n=0 ec=24/16 lis/c=24/24 les/c/f=26/26/0 sis=33) [2] r=0 lpr=34 pi=[24,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[5.12( empty local-lis/les=33/34 n=0 ec=24/16 lis/c=24/24 les/c/f=26/26/0 sis=33) [2] r=0 lpr=34 pi=[24,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[5.13( empty local-lis/les=33/34 n=0 ec=24/16 lis/c=24/24 les/c/f=26/26/0 sis=33) [2] r=0 lpr=34 pi=[24,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.c( empty local-lis/les=33/34 n=0 ec=20/12 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=34 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[5.b( empty local-lis/les=33/34 n=0 ec=24/16 lis/c=24/24 les/c/f=26/26/0 sis=33) [2] r=0 lpr=34 pi=[24,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[5.d( empty local-lis/les=33/34 n=0 ec=24/16 lis/c=24/24 les/c/f=26/26/0 sis=33) [2] r=0 lpr=34 pi=[24,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.14( empty local-lis/les=33/34 n=0 ec=22/14 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.d( empty local-lis/les=33/34 n=0 ec=20/12 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=34 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.a( empty local-lis/les=33/34 n=0 ec=20/12 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=34 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[5.0( empty local-lis/les=33/34 n=0 ec=16/16 lis/c=24/24 les/c/f=26/26/0 sis=33) [2] r=0 lpr=34 pi=[24,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.3( empty local-lis/les=33/34 n=0 ec=22/14 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.6( empty local-lis/les=33/34 n=0 ec=22/14 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[3.0( empty local-lis/les=33/34 n=0 ec=13/13 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.1b( empty local-lis/les=33/34 n=0 ec=20/12 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=34 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[3.8( empty local-lis/les=33/34 n=0 ec=22/13 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[2.10( empty local-lis/les=33/34 n=0 ec=20/12 lis/c=29/29 les/c/f=30/30/0 sis=33) [2] r=0 lpr=34 pi=[29,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.19( empty local-lis/les=33/34 n=0 ec=22/14 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.2( empty local-lis/les=33/34 n=0 ec=22/14 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[3.1b( empty local-lis/les=33/34 n=0 ec=22/13 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.1d( empty local-lis/les=33/34 n=0 ec=22/14 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 34 pg[4.1c( empty local-lis/les=33/34 n=0 ec=22/14 lis/c=22/22 les/c/f=23/23/0 sis=33) [2] r=0 lpr=34 pi=[22,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr handle_mgr_map respawning because set of enabled modules changed!
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr respawn  e: '/usr/bin/ceph-mgr'
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr respawn  0: '/usr/bin/ceph-mgr'
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr respawn  1: '-n'
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr respawn  2: 'mgr.compute-2.jtkauz'
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr respawn  3: '-f'
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr respawn  4: '--setuser'
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr respawn  5: 'ceph'
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr respawn  6: '--setgroup'
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr respawn  7: 'ceph'
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr respawn  8: '--default-log-to-file=false'
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr respawn  9: '--default-log-to-journald=true'
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr respawn  10: '--default-log-to-stderr=false'
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr respawn  exe_path /proc/self/exe
Nov 23 15:41:47 np0005532763 systemd[1]: session-27.scope: Deactivated successfully.
Nov 23 15:41:47 np0005532763 systemd[1]: session-29.scope: Deactivated successfully.
Nov 23 15:41:47 np0005532763 systemd[1]: session-28.scope: Deactivated successfully.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Session 29 logged out. Waiting for processes to exit.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Session 27 logged out. Waiting for processes to exit.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Session 28 logged out. Waiting for processes to exit.
Nov 23 15:41:47 np0005532763 systemd[1]: session-30.scope: Deactivated successfully.
Nov 23 15:41:47 np0005532763 systemd[1]: session-20.scope: Deactivated successfully.
Nov 23 15:41:47 np0005532763 systemd[1]: session-32.scope: Deactivated successfully.
Nov 23 15:41:47 np0005532763 systemd[1]: session-32.scope: Consumed 1min 10.605s CPU time.
Nov 23 15:41:47 np0005532763 systemd[1]: session-25.scope: Deactivated successfully.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Removed session 27.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Session 20 logged out. Waiting for processes to exit.
Nov 23 15:41:47 np0005532763 systemd[1]: session-31.scope: Deactivated successfully.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Session 30 logged out. Waiting for processes to exit.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Session 25 logged out. Waiting for processes to exit.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Session 32 logged out. Waiting for processes to exit.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Session 31 logged out. Waiting for processes to exit.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Removed session 29.
Nov 23 15:41:47 np0005532763 systemd[1]: session-23.scope: Deactivated successfully.
Nov 23 15:41:47 np0005532763 systemd[1]: session-26.scope: Deactivated successfully.
Nov 23 15:41:47 np0005532763 systemd[1]: session-22.scope: Deactivated successfully.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Session 23 logged out. Waiting for processes to exit.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Session 26 logged out. Waiting for processes to exit.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Session 22 logged out. Waiting for processes to exit.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Removed session 28.
Nov 23 15:41:47 np0005532763 systemd[1]: session-24.scope: Deactivated successfully.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Session 24 logged out. Waiting for processes to exit.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Removed session 30.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Removed session 20.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Removed session 32.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Removed session 25.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Removed session 31.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Removed session 23.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Removed session 26.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Removed session 22.
Nov 23 15:41:47 np0005532763 systemd-logind[830]: Removed session 24.
Nov 23 15:41:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: ignoring --setuser ceph since I am not root
Nov 23 15:41:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: ignoring --setgroup ceph since I am not root
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: pidfile_write: ignore empty --pid-file
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'alerts'
Nov 23 15:41:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:47.533+0000 7f72a0f08140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'balancer'
Nov 23 15:41:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:47.606+0000 7f72a0f08140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 23 15:41:47 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'cephadm'
Nov 23 15:41:47 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1621977935' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Nov 23 15:41:47 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.1f scrub starts
Nov 23 15:41:47 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.1f scrub ok
Nov 23 15:41:48 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'crash'
Nov 23 15:41:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:48.414+0000 7f72a0f08140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 23 15:41:48 np0005532763 ceph-mgr[76063]: mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 23 15:41:48 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'dashboard'
Nov 23 15:41:48 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.8 scrub starts
Nov 23 15:41:48 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.8 scrub ok
Nov 23 15:41:48 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'devicehealth'
Nov 23 15:41:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:49.011+0000 7f72a0f08140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 23 15:41:49 np0005532763 ceph-mgr[76063]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 23 15:41:49 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'diskprediction_local'
Nov 23 15:41:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Nov 23 15:41:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Nov 23 15:41:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]:  from numpy import show_config as show_numpy_config
Nov 23 15:41:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:49.174+0000 7f72a0f08140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 23 15:41:49 np0005532763 ceph-mgr[76063]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 23 15:41:49 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'influx'
Nov 23 15:41:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:49.241+0000 7f72a0f08140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 23 15:41:49 np0005532763 ceph-mgr[76063]: mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 23 15:41:49 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'insights'
Nov 23 15:41:49 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'iostat'
Nov 23 15:41:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e34 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:41:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:49.372+0000 7f72a0f08140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 23 15:41:49 np0005532763 ceph-mgr[76063]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 23 15:41:49 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'k8sevents'
Nov 23 15:41:49 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Nov 23 15:41:49 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Nov 23 15:41:49 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'localpool'
Nov 23 15:41:49 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'mds_autoscaler'
Nov 23 15:41:49 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'mirroring'
Nov 23 15:41:50 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'nfs'
Nov 23 15:41:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:50.309+0000 7f72a0f08140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 23 15:41:50 np0005532763 ceph-mgr[76063]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 23 15:41:50 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'orchestrator'
Nov 23 15:41:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:50.518+0000 7f72a0f08140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 23 15:41:50 np0005532763 ceph-mgr[76063]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 23 15:41:50 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'osd_perf_query'
Nov 23 15:41:50 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Nov 23 15:41:50 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Nov 23 15:41:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:50.594+0000 7f72a0f08140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 23 15:41:50 np0005532763 ceph-mgr[76063]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 23 15:41:50 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'osd_support'
Nov 23 15:41:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:50.661+0000 7f72a0f08140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 23 15:41:50 np0005532763 ceph-mgr[76063]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 23 15:41:50 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'pg_autoscaler'
Nov 23 15:41:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:50.741+0000 7f72a0f08140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 23 15:41:50 np0005532763 ceph-mgr[76063]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 23 15:41:50 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'progress'
Nov 23 15:41:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:50.809+0000 7f72a0f08140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 23 15:41:50 np0005532763 ceph-mgr[76063]: mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 23 15:41:50 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'prometheus'
Nov 23 15:41:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:51.131+0000 7f72a0f08140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 23 15:41:51 np0005532763 ceph-mgr[76063]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 23 15:41:51 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'rbd_support'
Nov 23 15:41:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:51.226+0000 7f72a0f08140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 23 15:41:51 np0005532763 ceph-mgr[76063]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 23 15:41:51 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'restful'
Nov 23 15:41:51 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'rgw'
Nov 23 15:41:51 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 3.e scrub starts
Nov 23 15:41:51 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 3.e scrub ok
Nov 23 15:41:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:51.638+0000 7f72a0f08140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 23 15:41:51 np0005532763 ceph-mgr[76063]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 23 15:41:51 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'rook'
Nov 23 15:41:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:52.185+0000 7f72a0f08140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 23 15:41:52 np0005532763 ceph-mgr[76063]: mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 23 15:41:52 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'selftest'
Nov 23 15:41:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:52.253+0000 7f72a0f08140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 23 15:41:52 np0005532763 ceph-mgr[76063]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 23 15:41:52 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'snap_schedule'
Nov 23 15:41:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:52.332+0000 7f72a0f08140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 23 15:41:52 np0005532763 ceph-mgr[76063]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 23 15:41:52 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'stats'
Nov 23 15:41:52 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'status'
Nov 23 15:41:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:52.473+0000 7f72a0f08140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Nov 23 15:41:52 np0005532763 ceph-mgr[76063]: mgr[py] Module status has missing NOTIFY_TYPES member
Nov 23 15:41:52 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'telegraf'
Nov 23 15:41:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:52.539+0000 7f72a0f08140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 23 15:41:52 np0005532763 ceph-mgr[76063]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 23 15:41:52 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'telemetry'
Nov 23 15:41:52 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 3.1a deep-scrub starts
Nov 23 15:41:52 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 3.1a deep-scrub ok
Nov 23 15:41:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:52.699+0000 7f72a0f08140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 23 15:41:52 np0005532763 ceph-mgr[76063]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 23 15:41:52 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'test_orchestrator'
Nov 23 15:41:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:52.915+0000 7f72a0f08140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 23 15:41:52 np0005532763 ceph-mgr[76063]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 23 15:41:52 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'volumes'
Nov 23 15:41:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:53.166+0000 7f72a0f08140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 23 15:41:53 np0005532763 ceph-mgr[76063]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 23 15:41:53 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'zabbix'
Nov 23 15:41:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:41:53.233+0000 7f72a0f08140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 23 15:41:53 np0005532763 ceph-mgr[76063]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 23 15:41:53 np0005532763 ceph-mgr[76063]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 15:41:53 np0005532763 ceph-mgr[76063]: mgr load Constructed class from module: dashboard
Nov 23 15:41:53 np0005532763 ceph-mgr[76063]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Nov 23 15:41:53 np0005532763 ceph-mgr[76063]: [dashboard INFO root] Configured CherryPy, starting engine...
Nov 23 15:41:53 np0005532763 ceph-mgr[76063]: [dashboard INFO root] Starting engine...
Nov 23 15:41:53 np0005532763 ceph-mgr[76063]: ms_deliver_dispatch: unhandled message 0x55afc82a5860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Nov 23 15:41:53 np0005532763 ceph-mgr[76063]: [dashboard INFO root] Engine started...
Nov 23 15:41:53 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.1 deep-scrub starts
Nov 23 15:41:53 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.1 deep-scrub ok
Nov 23 15:41:53 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e35 e35: 3 total, 3 up, 3 in
Nov 23 15:41:53 np0005532763 ceph-mon[75752]: Active manager daemon compute-0.oyehye restarted
Nov 23 15:41:53 np0005532763 ceph-mon[75752]: Activating manager daemon compute-0.oyehye
Nov 23 15:41:54 np0005532763 systemd-logind[830]: New session 33 of user ceph-admin.
Nov 23 15:41:54 np0005532763 systemd[1]: Started Session 33 of User ceph-admin.
Nov 23 15:41:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:41:54 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Nov 23 15:41:54 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Nov 23 15:41:54 np0005532763 ceph-mon[75752]: Manager daemon compute-0.oyehye is now available
Nov 23 15:41:54 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.oyehye/mirror_snapshot_schedule"}]: dispatch
Nov 23 15:41:54 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.oyehye/trash_purge_schedule"}]: dispatch
Nov 23 15:41:55 np0005532763 podman[81158]: 2025-11-23 20:41:55.261644634 +0000 UTC m=+0.087637555 container exec 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 15:41:55 np0005532763 podman[81158]: 2025-11-23 20:41:55.411750874 +0000 UTC m=+0.237743795 container exec_died 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Nov 23 15:41:55 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Nov 23 15:41:55 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Nov 23 15:41:55 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:55 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:55 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:55 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:55 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:55 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:56 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 5.e scrub starts
Nov 23 15:41:56 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 5.e scrub ok
Nov 23 15:41:56 np0005532763 ceph-mon[75752]: [23/Nov/2025:20:41:55] ENGINE Bus STARTING
Nov 23 15:41:56 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:56 np0005532763 ceph-mon[75752]: [23/Nov/2025:20:41:55] ENGINE Serving on https://192.168.122.100:7150
Nov 23 15:41:56 np0005532763 ceph-mon[75752]: [23/Nov/2025:20:41:55] ENGINE Client ('192.168.122.100', 34418) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 23 15:41:56 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:56 np0005532763 ceph-mon[75752]: [23/Nov/2025:20:41:55] ENGINE Serving on http://192.168.122.100:8765
Nov 23 15:41:56 np0005532763 ceph-mon[75752]: [23/Nov/2025:20:41:55] ENGINE Bus STARTED
Nov 23 15:41:56 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:56 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:56 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Nov 23 15:41:56 np0005532763 ceph-mon[75752]: Adjusting osd_memory_target on compute-0 to 127.9M
Nov 23 15:41:56 np0005532763 ceph-mon[75752]: Unable to set osd_memory_target on compute-0 to 134211993: error parsing value: Value '134211993' is below minimum 939524096
Nov 23 15:41:56 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:56 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:56 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Nov 23 15:41:56 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:57 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Nov 23 15:41:57 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Nov 23 15:41:57 np0005532763 ceph-mon[75752]: Adjusting osd_memory_target on compute-1 to 128.0M
Nov 23 15:41:57 np0005532763 ceph-mon[75752]: Unable to set osd_memory_target on compute-1 to 134217728: error parsing value: Value '134217728' is below minimum 939524096
Nov 23 15:41:57 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:57 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:57 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Nov 23 15:41:57 np0005532763 ceph-mon[75752]: Adjusting osd_memory_target on compute-2 to 128.0M
Nov 23 15:41:57 np0005532763 ceph-mon[75752]: Unable to set osd_memory_target on compute-2 to 134217728: error parsing value: Value '134217728' is below minimum 939524096
Nov 23 15:41:57 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:41:57 np0005532763 ceph-mon[75752]: Updating compute-0:/etc/ceph/ceph.conf
Nov 23 15:41:57 np0005532763 ceph-mon[75752]: Updating compute-1:/etc/ceph/ceph.conf
Nov 23 15:41:57 np0005532763 ceph-mon[75752]: Updating compute-2:/etc/ceph/ceph.conf
Nov 23 15:41:57 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:58 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Nov 23 15:41:58 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Nov 23 15:41:58 np0005532763 ceph-mon[75752]: Updating compute-1:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.conf
Nov 23 15:41:58 np0005532763 ceph-mon[75752]: Updating compute-0:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.conf
Nov 23 15:41:58 np0005532763 ceph-mon[75752]: Updating compute-2:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.conf
Nov 23 15:41:58 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:41:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:41:59 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Nov 23 15:41:59 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr handle_mgr_map respawning because set of enabled modules changed!
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr respawn  e: '/usr/bin/ceph-mgr'
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr respawn  0: '/usr/bin/ceph-mgr'
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr respawn  1: '-n'
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr respawn  2: 'mgr.compute-2.jtkauz'
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr respawn  3: '-f'
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr respawn  4: '--setuser'
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr respawn  5: 'ceph'
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr respawn  6: '--setgroup'
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr respawn  7: 'ceph'
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr respawn  8: '--default-log-to-file=false'
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr respawn  9: '--default-log-to-journald=true'
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr respawn  10: '--default-log-to-stderr=false'
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr respawn  exe_path /proc/self/exe
Nov 23 15:42:00 np0005532763 systemd[1]: session-33.scope: Deactivated successfully.
Nov 23 15:42:00 np0005532763 systemd[1]: session-33.scope: Consumed 6.093s CPU time.
Nov 23 15:42:00 np0005532763 systemd-logind[830]: Session 33 logged out. Waiting for processes to exit.
Nov 23 15:42:00 np0005532763 systemd-logind[830]: Removed session 33.
Nov 23 15:42:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: ignoring --setuser ceph since I am not root
Nov 23 15:42:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: ignoring --setgroup ceph since I am not root
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: pidfile_write: ignore empty --pid-file
Nov 23 15:42:00 np0005532763 ceph-mon[75752]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Nov 23 15:42:00 np0005532763 ceph-mon[75752]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Nov 23 15:42:00 np0005532763 ceph-mon[75752]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Nov 23 15:42:00 np0005532763 ceph-mon[75752]: Updating compute-1:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.client.admin.keyring
Nov 23 15:42:00 np0005532763 ceph-mon[75752]: Updating compute-0:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.client.admin.keyring
Nov 23 15:42:00 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/319512723' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Nov 23 15:42:00 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:00 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:00 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:00 np0005532763 ceph-mon[75752]: from='mgr.14337 192.168.122.100:0/1869846579' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'alerts'
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'balancer'
Nov 23 15:42:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:00.369+0000 7fdaa0de9140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 23 15:42:00 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'cephadm'
Nov 23 15:42:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:00.457+0000 7fdaa0de9140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 23 15:42:00 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Nov 23 15:42:00 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Nov 23 15:42:01 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'crash'
Nov 23 15:42:01 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/319512723' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Nov 23 15:42:01 np0005532763 ceph-mgr[76063]: mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 23 15:42:01 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'dashboard'
Nov 23 15:42:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:01.252+0000 7fdaa0de9140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 23 15:42:01 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Nov 23 15:42:01 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Nov 23 15:42:01 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'devicehealth'
Nov 23 15:42:01 np0005532763 ceph-mgr[76063]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 23 15:42:01 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'diskprediction_local'
Nov 23 15:42:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:01.854+0000 7fdaa0de9140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 23 15:42:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Nov 23 15:42:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Nov 23 15:42:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]:  from numpy import show_config as show_numpy_config
Nov 23 15:42:02 np0005532763 ceph-mgr[76063]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 23 15:42:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:02.008+0000 7fdaa0de9140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 23 15:42:02 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'influx'
Nov 23 15:42:02 np0005532763 ceph-mgr[76063]: mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 23 15:42:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:02.076+0000 7fdaa0de9140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 23 15:42:02 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'insights'
Nov 23 15:42:02 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'iostat'
Nov 23 15:42:02 np0005532763 ceph-mgr[76063]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 23 15:42:02 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'k8sevents'
Nov 23 15:42:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:02.209+0000 7fdaa0de9140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 23 15:42:02 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/2985907711' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Nov 23 15:42:02 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'localpool'
Nov 23 15:42:02 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'mds_autoscaler'
Nov 23 15:42:02 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 2.c scrub starts
Nov 23 15:42:02 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 2.c scrub ok
Nov 23 15:42:02 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'mirroring'
Nov 23 15:42:02 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'nfs'
Nov 23 15:42:03 np0005532763 ceph-mgr[76063]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 23 15:42:03 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'orchestrator'
Nov 23 15:42:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:03.121+0000 7fdaa0de9140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 23 15:42:03 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/2985907711' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Nov 23 15:42:03 np0005532763 ceph-mgr[76063]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 23 15:42:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:03.333+0000 7fdaa0de9140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 23 15:42:03 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'osd_perf_query'
Nov 23 15:42:03 np0005532763 ceph-mgr[76063]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 23 15:42:03 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'osd_support'
Nov 23 15:42:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:03.403+0000 7fdaa0de9140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 23 15:42:03 np0005532763 ceph-mgr[76063]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 23 15:42:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:03.465+0000 7fdaa0de9140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 23 15:42:03 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'pg_autoscaler'
Nov 23 15:42:03 np0005532763 ceph-mgr[76063]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 23 15:42:03 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'progress'
Nov 23 15:42:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:03.538+0000 7fdaa0de9140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 23 15:42:03 np0005532763 ceph-mgr[76063]: mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 23 15:42:03 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'prometheus'
Nov 23 15:42:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:03.608+0000 7fdaa0de9140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 23 15:42:03 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 5.b scrub starts
Nov 23 15:42:03 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 5.b scrub ok
Nov 23 15:42:03 np0005532763 ceph-mgr[76063]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 23 15:42:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:03.930+0000 7fdaa0de9140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 23 15:42:03 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'rbd_support'
Nov 23 15:42:04 np0005532763 ceph-mgr[76063]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 23 15:42:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:04.022+0000 7fdaa0de9140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 23 15:42:04 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'restful'
Nov 23 15:42:04 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'rgw'
Nov 23 15:42:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:42:04 np0005532763 ceph-mgr[76063]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 23 15:42:04 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'rook'
Nov 23 15:42:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:04.436+0000 7fdaa0de9140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 23 15:42:04 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 5.d scrub starts
Nov 23 15:42:04 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 5.d scrub ok
Nov 23 15:42:04 np0005532763 ceph-mgr[76063]: mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 23 15:42:04 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'selftest'
Nov 23 15:42:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:04.957+0000 7fdaa0de9140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 23 15:42:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:05.022+0000 7fdaa0de9140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'snap_schedule'
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'stats'
Nov 23 15:42:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:05.094+0000 7fdaa0de9140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'status'
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr[py] Module status has missing NOTIFY_TYPES member
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'telegraf'
Nov 23 15:42:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:05.228+0000 7fdaa0de9140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 23 15:42:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:05.293+0000 7fdaa0de9140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'telemetry'
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'test_orchestrator'
Nov 23 15:42:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:05.441+0000 7fdaa0de9140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 23 15:42:05 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 2.d scrub starts
Nov 23 15:42:05 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 2.d scrub ok
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'volumes'
Nov 23 15:42:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:05.648+0000 7fdaa0de9140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'zabbix'
Nov 23 15:42:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:05.887+0000 7fdaa0de9140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 23 15:42:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:05.950+0000 7fdaa0de9140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: ms_deliver_dispatch: unhandled message 0x5562d5613860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr handle_mgr_map respawning because set of enabled modules changed!
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr respawn  e: '/usr/bin/ceph-mgr'
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr respawn  0: '/usr/bin/ceph-mgr'
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr respawn  1: '-n'
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr respawn  2: 'mgr.compute-2.jtkauz'
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr respawn  3: '-f'
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr respawn  4: '--setuser'
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr respawn  5: 'ceph'
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr respawn  6: '--setgroup'
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr respawn  7: 'ceph'
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr respawn  8: '--default-log-to-file=false'
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr respawn  9: '--default-log-to-journald=true'
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr respawn  10: '--default-log-to-stderr=false'
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Nov 23 15:42:05 np0005532763 ceph-mgr[76063]: mgr respawn  exe_path /proc/self/exe
Nov 23 15:42:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: ignoring --setuser ceph since I am not root
Nov 23 15:42:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: ignoring --setgroup ceph since I am not root
Nov 23 15:42:06 np0005532763 ceph-mgr[76063]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Nov 23 15:42:06 np0005532763 ceph-mgr[76063]: pidfile_write: ignore empty --pid-file
Nov 23 15:42:06 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'alerts'
Nov 23 15:42:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:06.192+0000 7f2ddc64b140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 23 15:42:06 np0005532763 ceph-mgr[76063]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 23 15:42:06 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'balancer'
Nov 23 15:42:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:06.264+0000 7f2ddc64b140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 23 15:42:06 np0005532763 ceph-mgr[76063]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 23 15:42:06 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'cephadm'
Nov 23 15:42:06 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e36 e36: 3 total, 3 up, 3 in
Nov 23 15:42:06 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 2.a scrub starts
Nov 23 15:42:06 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 2.a scrub ok
Nov 23 15:42:06 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'crash'
Nov 23 15:42:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:07.009+0000 7f2ddc64b140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 23 15:42:07 np0005532763 ceph-mgr[76063]: mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 23 15:42:07 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'dashboard'
Nov 23 15:42:07 np0005532763 ceph-mon[75752]: Active manager daemon compute-0.oyehye restarted
Nov 23 15:42:07 np0005532763 ceph-mon[75752]: Activating manager daemon compute-0.oyehye
Nov 23 15:42:07 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'devicehealth'
Nov 23 15:42:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:07.570+0000 7f2ddc64b140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 23 15:42:07 np0005532763 ceph-mgr[76063]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 23 15:42:07 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'diskprediction_local'
Nov 23 15:42:07 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 5.0 scrub starts
Nov 23 15:42:07 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 5.0 scrub ok
Nov 23 15:42:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Nov 23 15:42:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Nov 23 15:42:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]:  from numpy import show_config as show_numpy_config
Nov 23 15:42:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:07.722+0000 7f2ddc64b140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 23 15:42:07 np0005532763 ceph-mgr[76063]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 23 15:42:07 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'influx'
Nov 23 15:42:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:07.788+0000 7f2ddc64b140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 23 15:42:07 np0005532763 ceph-mgr[76063]: mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 23 15:42:07 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'insights'
Nov 23 15:42:07 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'iostat'
Nov 23 15:42:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:07.918+0000 7f2ddc64b140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 23 15:42:07 np0005532763 ceph-mgr[76063]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 23 15:42:07 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'k8sevents'
Nov 23 15:42:08 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'localpool'
Nov 23 15:42:08 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'mds_autoscaler'
Nov 23 15:42:08 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'mirroring'
Nov 23 15:42:08 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.6 scrub starts
Nov 23 15:42:08 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.6 scrub ok
Nov 23 15:42:08 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'nfs'
Nov 23 15:42:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:08.839+0000 7f2ddc64b140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 23 15:42:08 np0005532763 ceph-mgr[76063]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 23 15:42:08 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'orchestrator'
Nov 23 15:42:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:09.042+0000 7f2ddc64b140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 23 15:42:09 np0005532763 ceph-mgr[76063]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 23 15:42:09 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'osd_perf_query'
Nov 23 15:42:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:09.114+0000 7f2ddc64b140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 23 15:42:09 np0005532763 ceph-mgr[76063]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 23 15:42:09 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'osd_support'
Nov 23 15:42:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:09.177+0000 7f2ddc64b140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 23 15:42:09 np0005532763 ceph-mgr[76063]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 23 15:42:09 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'pg_autoscaler'
Nov 23 15:42:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:09.250+0000 7f2ddc64b140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 23 15:42:09 np0005532763 ceph-mgr[76063]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 23 15:42:09 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'progress'
Nov 23 15:42:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:09.315+0000 7f2ddc64b140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 23 15:42:09 np0005532763 ceph-mgr[76063]: mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 23 15:42:09 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'prometheus'
Nov 23 15:42:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:42:09 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.3 scrub starts
Nov 23 15:42:09 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.3 scrub ok
Nov 23 15:42:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:09.633+0000 7f2ddc64b140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 23 15:42:09 np0005532763 ceph-mgr[76063]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 23 15:42:09 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'rbd_support'
Nov 23 15:42:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:09.725+0000 7f2ddc64b140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 23 15:42:09 np0005532763 ceph-mgr[76063]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 23 15:42:09 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'restful'
Nov 23 15:42:09 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'rgw'
Nov 23 15:42:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:10.120+0000 7f2ddc64b140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 23 15:42:10 np0005532763 ceph-mgr[76063]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 23 15:42:10 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'rook'
Nov 23 15:42:10 np0005532763 systemd[1]: Stopping User Manager for UID 42477...
Nov 23 15:42:10 np0005532763 systemd[72454]: Activating special unit Exit the Session...
Nov 23 15:42:10 np0005532763 systemd[72454]: Stopped target Main User Target.
Nov 23 15:42:10 np0005532763 systemd[72454]: Stopped target Basic System.
Nov 23 15:42:10 np0005532763 systemd[72454]: Stopped target Paths.
Nov 23 15:42:10 np0005532763 systemd[72454]: Stopped target Sockets.
Nov 23 15:42:10 np0005532763 systemd[72454]: Stopped target Timers.
Nov 23 15:42:10 np0005532763 systemd[72454]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 23 15:42:10 np0005532763 systemd[72454]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 23 15:42:10 np0005532763 systemd[72454]: Closed D-Bus User Message Bus Socket.
Nov 23 15:42:10 np0005532763 systemd[72454]: Stopped Create User's Volatile Files and Directories.
Nov 23 15:42:10 np0005532763 systemd[72454]: Removed slice User Application Slice.
Nov 23 15:42:10 np0005532763 systemd[72454]: Reached target Shutdown.
Nov 23 15:42:10 np0005532763 systemd[72454]: Finished Exit the Session.
Nov 23 15:42:10 np0005532763 systemd[72454]: Reached target Exit the Session.
Nov 23 15:42:10 np0005532763 systemd[1]: user@42477.service: Deactivated successfully.
Nov 23 15:42:10 np0005532763 systemd[1]: Stopped User Manager for UID 42477.
Nov 23 15:42:10 np0005532763 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Nov 23 15:42:10 np0005532763 systemd[1]: run-user-42477.mount: Deactivated successfully.
Nov 23 15:42:10 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 3.0 scrub starts
Nov 23 15:42:10 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 3.0 scrub ok
Nov 23 15:42:10 np0005532763 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Nov 23 15:42:10 np0005532763 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Nov 23 15:42:10 np0005532763 systemd[1]: Removed slice User Slice of UID 42477.
Nov 23 15:42:10 np0005532763 systemd[1]: user-42477.slice: Consumed 1min 18.506s CPU time.
Nov 23 15:42:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:10.630+0000 7f2ddc64b140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 23 15:42:10 np0005532763 ceph-mgr[76063]: mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 23 15:42:10 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'selftest'
Nov 23 15:42:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:10.694+0000 7f2ddc64b140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 23 15:42:10 np0005532763 ceph-mgr[76063]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 23 15:42:10 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'snap_schedule'
Nov 23 15:42:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:10.766+0000 7f2ddc64b140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 23 15:42:10 np0005532763 ceph-mgr[76063]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 23 15:42:10 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'stats'
Nov 23 15:42:10 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'status'
Nov 23 15:42:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:10.899+0000 7f2ddc64b140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Nov 23 15:42:10 np0005532763 ceph-mgr[76063]: mgr[py] Module status has missing NOTIFY_TYPES member
Nov 23 15:42:10 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'telegraf'
Nov 23 15:42:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:10.962+0000 7f2ddc64b140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 23 15:42:10 np0005532763 ceph-mgr[76063]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 23 15:42:10 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'telemetry'
Nov 23 15:42:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:11.102+0000 7f2ddc64b140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 23 15:42:11 np0005532763 ceph-mgr[76063]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 23 15:42:11 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'test_orchestrator'
Nov 23 15:42:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:11.300+0000 7f2ddc64b140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 23 15:42:11 np0005532763 ceph-mgr[76063]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 23 15:42:11 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'volumes'
Nov 23 15:42:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:11.539+0000 7f2ddc64b140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 23 15:42:11 np0005532763 ceph-mgr[76063]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 23 15:42:11 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'zabbix'
Nov 23 15:42:11 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 2.10 deep-scrub starts
Nov 23 15:42:11 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 2.10 deep-scrub ok
Nov 23 15:42:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:42:11.602+0000 7f2ddc64b140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 23 15:42:11 np0005532763 ceph-mgr[76063]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 23 15:42:11 np0005532763 ceph-mgr[76063]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 15:42:11 np0005532763 ceph-mgr[76063]: mgr load Constructed class from module: dashboard
Nov 23 15:42:11 np0005532763 ceph-mgr[76063]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Nov 23 15:42:11 np0005532763 ceph-mgr[76063]: ms_deliver_dispatch: unhandled message 0x55d6098b1860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Nov 23 15:42:11 np0005532763 ceph-mgr[76063]: [dashboard INFO root] Configured CherryPy, starting engine...
Nov 23 15:42:11 np0005532763 ceph-mgr[76063]: [dashboard INFO root] Starting engine...
Nov 23 15:42:11 np0005532763 ceph-mgr[76063]: [dashboard INFO root] Engine started...
Nov 23 15:42:12 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Nov 23 15:42:12 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Nov 23 15:42:12 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e37 e37: 3 total, 3 up, 3 in
Nov 23 15:42:13 np0005532763 systemd-logind[830]: New session 34 of user ceph-admin.
Nov 23 15:42:13 np0005532763 systemd[1]: Created slice User Slice of UID 42477.
Nov 23 15:42:13 np0005532763 systemd[1]: Starting User Runtime Directory /run/user/42477...
Nov 23 15:42:13 np0005532763 systemd[1]: Finished User Runtime Directory /run/user/42477.
Nov 23 15:42:13 np0005532763 systemd[1]: Starting User Manager for UID 42477...
Nov 23 15:42:13 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.19 scrub starts
Nov 23 15:42:13 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.19 scrub ok
Nov 23 15:42:13 np0005532763 systemd[82270]: Queued start job for default target Main User Target.
Nov 23 15:42:13 np0005532763 systemd[82270]: Created slice User Application Slice.
Nov 23 15:42:13 np0005532763 systemd[82270]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 23 15:42:13 np0005532763 systemd[82270]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 15:42:13 np0005532763 systemd[82270]: Reached target Paths.
Nov 23 15:42:13 np0005532763 systemd[82270]: Reached target Timers.
Nov 23 15:42:13 np0005532763 systemd[82270]: Starting D-Bus User Message Bus Socket...
Nov 23 15:42:13 np0005532763 systemd[82270]: Starting Create User's Volatile Files and Directories...
Nov 23 15:42:13 np0005532763 systemd[82270]: Listening on D-Bus User Message Bus Socket.
Nov 23 15:42:13 np0005532763 systemd[82270]: Reached target Sockets.
Nov 23 15:42:13 np0005532763 systemd[82270]: Finished Create User's Volatile Files and Directories.
Nov 23 15:42:13 np0005532763 systemd[82270]: Reached target Basic System.
Nov 23 15:42:13 np0005532763 systemd[82270]: Reached target Main User Target.
Nov 23 15:42:13 np0005532763 systemd[82270]: Startup finished in 163ms.
Nov 23 15:42:13 np0005532763 systemd[1]: Started User Manager for UID 42477.
Nov 23 15:42:13 np0005532763 systemd[1]: Started Session 34 of User ceph-admin.
Nov 23 15:42:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:42:14 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Nov 23 15:42:14 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Nov 23 15:42:14 np0005532763 podman[82409]: 2025-11-23 20:42:14.620779079 +0000 UTC m=+0.097155970 container exec 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Nov 23 15:42:14 np0005532763 podman[82409]: 2025-11-23 20:42:14.739860986 +0000 UTC m=+0.216237867 container exec_died 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 23 15:42:15 np0005532763 ceph-mon[75752]: Active manager daemon compute-0.oyehye restarted
Nov 23 15:42:15 np0005532763 ceph-mon[75752]: Activating manager daemon compute-0.oyehye
Nov 23 15:42:15 np0005532763 ceph-mon[75752]: Manager daemon compute-0.oyehye is now available
Nov 23 15:42:15 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.oyehye/mirror_snapshot_schedule"}]: dispatch
Nov 23 15:42:15 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.oyehye/trash_purge_schedule"}]: dispatch
Nov 23 15:42:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e2 new map
Nov 23 15:42:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e2 print_map#012e2#012btime 2025-11-23T20:42:15:389935+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-23T20:42:15.389822+0000#012modified#0112025-11-23T20:42:15.389822+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012 #012 
Nov 23 15:42:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e38 e38: 3 total, 3 up, 3 in
Nov 23 15:42:15 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Nov 23 15:42:15 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: [23/Nov/2025:20:42:14] ENGINE Bus STARTING
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: [23/Nov/2025:20:42:14] ENGINE Serving on http://192.168.122.100:8765
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: [23/Nov/2025:20:42:14] ENGINE Serving on https://192.168.122.100:7150
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: [23/Nov/2025:20:42:14] ENGINE Bus STARTED
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: [23/Nov/2025:20:42:14] ENGINE Client ('192.168.122.100', 49202) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:16 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Nov 23 15:42:16 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Nov 23 15:42:17 np0005532763 ceph-mon[75752]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Nov 23 15:42:17 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:17 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:17 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:17 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Nov 23 15:42:17 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:17 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:17 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Nov 23 15:42:17 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:17 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:17 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Nov 23 15:42:17 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:42:17 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.1c deep-scrub starts
Nov 23 15:42:17 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 4.1c deep-scrub ok
Nov 23 15:42:18 np0005532763 ceph-mon[75752]: Adjusting osd_memory_target on compute-1 to 128.0M
Nov 23 15:42:18 np0005532763 ceph-mon[75752]: Unable to set osd_memory_target on compute-1 to 134217728: error parsing value: Value '134217728' is below minimum 939524096
Nov 23 15:42:18 np0005532763 ceph-mon[75752]: Adjusting osd_memory_target on compute-2 to 128.0M
Nov 23 15:42:18 np0005532763 ceph-mon[75752]: Unable to set osd_memory_target on compute-2 to 134217728: error parsing value: Value '134217728' is below minimum 939524096
Nov 23 15:42:18 np0005532763 ceph-mon[75752]: Adjusting osd_memory_target on compute-0 to 127.9M
Nov 23 15:42:18 np0005532763 ceph-mon[75752]: Unable to set osd_memory_target on compute-0 to 134211993: error parsing value: Value '134211993' is below minimum 939524096
Nov 23 15:42:18 np0005532763 ceph-mon[75752]: Updating compute-0:/etc/ceph/ceph.conf
Nov 23 15:42:18 np0005532763 ceph-mon[75752]: Updating compute-1:/etc/ceph/ceph.conf
Nov 23 15:42:18 np0005532763 ceph-mon[75752]: Updating compute-2:/etc/ceph/ceph.conf
Nov 23 15:42:18 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Nov 23 15:42:18 np0005532763 ceph-mon[75752]: Updating compute-1:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.conf
Nov 23 15:42:18 np0005532763 ceph-mon[75752]: Updating compute-0:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.conf
Nov 23 15:42:18 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e39 e39: 3 total, 3 up, 3 in
Nov 23 15:42:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:42:19 np0005532763 ceph-mon[75752]: Updating compute-2:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.conf
Nov 23 15:42:19 np0005532763 ceph-mon[75752]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Nov 23 15:42:19 np0005532763 ceph-mon[75752]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Nov 23 15:42:19 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Nov 23 15:42:19 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Nov 23 15:42:19 np0005532763 ceph-mon[75752]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Nov 23 15:42:19 np0005532763 ceph-mon[75752]: Updating compute-0:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.client.admin.keyring
Nov 23 15:42:19 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:19 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:19 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:19 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e40 e40: 3 total, 3 up, 3 in
Nov 23 15:42:20 np0005532763 ceph-mon[75752]: Updating compute-1:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.client.admin.keyring
Nov 23 15:42:20 np0005532763 ceph-mon[75752]: Updating compute-2:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.client.admin.keyring
Nov 23 15:42:20 np0005532763 ceph-mon[75752]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 23 15:42:20 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Nov 23 15:42:20 np0005532763 ceph-mon[75752]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Nov 23 15:42:20 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:20 np0005532763 ceph-mon[75752]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Nov 23 15:42:20 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:20 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:20 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:20 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:20 np0005532763 ceph-mon[75752]: Deploying daemon node-exporter.compute-0 on compute-0
Nov 23 15:42:20 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e41 e41: 3 total, 3 up, 3 in
Nov 23 15:42:22 np0005532763 ceph-mon[75752]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Nov 23 15:42:22 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1678765881' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Nov 23 15:42:22 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1678765881' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Nov 23 15:42:23 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:23 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:23 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:23 np0005532763 ceph-mon[75752]: Deploying daemon node-exporter.compute-1 on compute-1
Nov 23 15:42:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:42:25 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:25 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:25 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:25 np0005532763 ceph-mon[75752]: Deploying daemon node-exporter.compute-2 on compute-2
Nov 23 15:42:25 np0005532763 systemd[1]: Reloading.
Nov 23 15:42:25 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:42:25 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:42:26 np0005532763 systemd[1]: Reloading.
Nov 23 15:42:26 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:42:26 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:42:26 np0005532763 systemd[1]: Starting Ceph node-exporter.compute-2 for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:42:26 np0005532763 bash[83756]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Nov 23 15:42:27 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/1987053989' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Nov 23 15:42:27 np0005532763 bash[83756]: Getting image source signatures
Nov 23 15:42:27 np0005532763 bash[83756]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Nov 23 15:42:27 np0005532763 bash[83756]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Nov 23 15:42:27 np0005532763 bash[83756]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Nov 23 15:42:28 np0005532763 bash[83756]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Nov 23 15:42:28 np0005532763 bash[83756]: Writing manifest to image destination
Nov 23 15:42:28 np0005532763 podman[83756]: 2025-11-23 20:42:28.118230716 +0000 UTC m=+1.336152619 container create bfa89024a4f3a8c3745fbdf8141ab9c1af6ff603988de647c9e7f7e15dff8638 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 15:42:28 np0005532763 podman[83756]: 2025-11-23 20:42:28.095621068 +0000 UTC m=+1.313543021 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Nov 23 15:42:28 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/360d38d9c3a7fb33cafea32c618dd167d0ef2afba72459d39bd1654286991baa/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Nov 23 15:42:28 np0005532763 podman[83756]: 2025-11-23 20:42:28.183349243 +0000 UTC m=+1.401271126 container init bfa89024a4f3a8c3745fbdf8141ab9c1af6ff603988de647c9e7f7e15dff8638 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 15:42:28 np0005532763 podman[83756]: 2025-11-23 20:42:28.187490702 +0000 UTC m=+1.405412565 container start bfa89024a4f3a8c3745fbdf8141ab9c1af6ff603988de647c9e7f7e15dff8638 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 15:42:28 np0005532763 bash[83756]: bfa89024a4f3a8c3745fbdf8141ab9c1af6ff603988de647c9e7f7e15dff8638
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.201Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.202Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Nov 23 15:42:28 np0005532763 systemd[1]: Started Ceph node-exporter.compute-2 for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.203Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.203Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.206Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.206Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.207Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.207Z caller=node_exporter.go:117 level=info collector=arp
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.207Z caller=node_exporter.go:117 level=info collector=bcache
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.207Z caller=node_exporter.go:117 level=info collector=bonding
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.207Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.207Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.207Z caller=node_exporter.go:117 level=info collector=cpu
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.207Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.207Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.207Z caller=node_exporter.go:117 level=info collector=dmi
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.207Z caller=node_exporter.go:117 level=info collector=edac
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.208Z caller=node_exporter.go:117 level=info collector=entropy
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.208Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.208Z caller=node_exporter.go:117 level=info collector=filefd
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.208Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.208Z caller=node_exporter.go:117 level=info collector=hwmon
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.208Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.208Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.208Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.208Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.208Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.208Z caller=node_exporter.go:117 level=info collector=netclass
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.208Z caller=node_exporter.go:117 level=info collector=netdev
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.208Z caller=node_exporter.go:117 level=info collector=netstat
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.209Z caller=node_exporter.go:117 level=info collector=nfs
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.209Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.209Z caller=node_exporter.go:117 level=info collector=nvme
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.209Z caller=node_exporter.go:117 level=info collector=os
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.209Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.209Z caller=node_exporter.go:117 level=info collector=pressure
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.209Z caller=node_exporter.go:117 level=info collector=rapl
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.209Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.209Z caller=node_exporter.go:117 level=info collector=selinux
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.209Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.209Z caller=node_exporter.go:117 level=info collector=softnet
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.210Z caller=node_exporter.go:117 level=info collector=stat
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.210Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.210Z caller=node_exporter.go:117 level=info collector=textfile
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.210Z caller=node_exporter.go:117 level=info collector=thermal_zone
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.210Z caller=node_exporter.go:117 level=info collector=time
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.210Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.210Z caller=node_exporter.go:117 level=info collector=uname
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.210Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.210Z caller=node_exporter.go:117 level=info collector=xfs
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.210Z caller=node_exporter.go:117 level=info collector=zfs
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.212Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Nov 23 15:42:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2[83833]: ts=2025-11-23T20:42:28.213Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Nov 23 15:42:29 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:29 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:29 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:29 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:29 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:42:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:42:33 np0005532763 podman[83931]: 2025-11-23 20:42:33.138397105 +0000 UTC m=+0.057552781 container create 0a7616bab92956b6c1b24068d7ecc96a3286024e7e69351bc9a3c7c8e73aae47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=charming_elbakyan, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Nov 23 15:42:33 np0005532763 podman[83931]: 2025-11-23 20:42:33.111245306 +0000 UTC m=+0.030401022 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:42:33 np0005532763 systemd[1]: Started libpod-conmon-0a7616bab92956b6c1b24068d7ecc96a3286024e7e69351bc9a3c7c8e73aae47.scope.
Nov 23 15:42:33 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:42:33 np0005532763 podman[83931]: 2025-11-23 20:42:33.266438975 +0000 UTC m=+0.185594711 container init 0a7616bab92956b6c1b24068d7ecc96a3286024e7e69351bc9a3c7c8e73aae47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=charming_elbakyan, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 15:42:33 np0005532763 podman[83931]: 2025-11-23 20:42:33.279552011 +0000 UTC m=+0.198707697 container start 0a7616bab92956b6c1b24068d7ecc96a3286024e7e69351bc9a3c7c8e73aae47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=charming_elbakyan, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Nov 23 15:42:33 np0005532763 podman[83931]: 2025-11-23 20:42:33.286068407 +0000 UTC m=+0.205224143 container attach 0a7616bab92956b6c1b24068d7ecc96a3286024e7e69351bc9a3c7c8e73aae47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=charming_elbakyan, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:42:33 np0005532763 systemd[1]: libpod-0a7616bab92956b6c1b24068d7ecc96a3286024e7e69351bc9a3c7c8e73aae47.scope: Deactivated successfully.
Nov 23 15:42:33 np0005532763 charming_elbakyan[83947]: 167 167
Nov 23 15:42:33 np0005532763 conmon[83947]: conmon 0a7616bab92956b6c1b2 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-0a7616bab92956b6c1b24068d7ecc96a3286024e7e69351bc9a3c7c8e73aae47.scope/container/memory.events
Nov 23 15:42:33 np0005532763 podman[83931]: 2025-11-23 20:42:33.29245126 +0000 UTC m=+0.211606896 container died 0a7616bab92956b6c1b24068d7ecc96a3286024e7e69351bc9a3c7c8e73aae47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=charming_elbakyan, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 15:42:33 np0005532763 systemd[1]: var-lib-containers-storage-overlay-481c1b8fa9bca18f58250510fd5bef6918dbdeb9432e0ea3f018ba88f95b0b7c-merged.mount: Deactivated successfully.
Nov 23 15:42:33 np0005532763 podman[83931]: 2025-11-23 20:42:33.342487895 +0000 UTC m=+0.261643551 container remove 0a7616bab92956b6c1b24068d7ecc96a3286024e7e69351bc9a3c7c8e73aae47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=charming_elbakyan, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Nov 23 15:42:33 np0005532763 systemd[1]: libpod-conmon-0a7616bab92956b6c1b24068d7ecc96a3286024e7e69351bc9a3c7c8e73aae47.scope: Deactivated successfully.
Nov 23 15:42:33 np0005532763 systemd[1]: Reloading.
Nov 23 15:42:33 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:42:33 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:42:33 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:33 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:33 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.cwocqr", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 23 15:42:33 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.cwocqr", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 23 15:42:33 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:33 np0005532763 ceph-mon[75752]: Deploying daemon rgw.rgw.compute-2.cwocqr on compute-2
Nov 23 15:42:33 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:33 np0005532763 systemd[1]: Reloading.
Nov 23 15:42:33 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:42:33 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:42:33 np0005532763 systemd[1]: Starting Ceph rgw.rgw.compute-2.cwocqr for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:42:34 np0005532763 podman[84093]: 2025-11-23 20:42:34.300789234 +0000 UTC m=+0.064530751 container create 93555d176e1c6acce7737f700608561ace078807462da1bf977591b98db2b962 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-rgw-rgw-compute-2-cwocqr, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 23 15:42:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:42:34 np0005532763 podman[84093]: 2025-11-23 20:42:34.268688414 +0000 UTC m=+0.032429981 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:42:34 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b74f2837cc2d17bdecd3e946bd0882b21ae9e30923bdf34a6031bc3a662ac7e9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:42:34 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b74f2837cc2d17bdecd3e946bd0882b21ae9e30923bdf34a6031bc3a662ac7e9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 15:42:34 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b74f2837cc2d17bdecd3e946bd0882b21ae9e30923bdf34a6031bc3a662ac7e9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 15:42:34 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b74f2837cc2d17bdecd3e946bd0882b21ae9e30923bdf34a6031bc3a662ac7e9/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-2.cwocqr supports timestamps until 2038 (0x7fffffff)
Nov 23 15:42:34 np0005532763 podman[84093]: 2025-11-23 20:42:34.38751841 +0000 UTC m=+0.151259947 container init 93555d176e1c6acce7737f700608561ace078807462da1bf977591b98db2b962 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-rgw-rgw-compute-2-cwocqr, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 23 15:42:34 np0005532763 podman[84093]: 2025-11-23 20:42:34.398013431 +0000 UTC m=+0.161754948 container start 93555d176e1c6acce7737f700608561ace078807462da1bf977591b98db2b962 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-rgw-rgw-compute-2-cwocqr, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 15:42:34 np0005532763 bash[84093]: 93555d176e1c6acce7737f700608561ace078807462da1bf977591b98db2b962
Nov 23 15:42:34 np0005532763 systemd[1]: Started Ceph rgw.rgw.compute-2.cwocqr for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:42:34 np0005532763 radosgw[84112]: deferred set uid:gid to 167:167 (ceph:ceph)
Nov 23 15:42:34 np0005532763 radosgw[84112]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Nov 23 15:42:34 np0005532763 radosgw[84112]: framework: beast
Nov 23 15:42:34 np0005532763 radosgw[84112]: framework conf key: endpoint, val: 192.168.122.102:8082
Nov 23 15:42:34 np0005532763 radosgw[84112]: init_numa not setting numa affinity
Nov 23 15:42:34 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:34 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:34 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:34 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.exwrda", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 23 15:42:34 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.exwrda", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 23 15:42:34 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:35 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e42 e42: 3 total, 3 up, 3 in
Nov 23 15:42:35 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"} v 0)
Nov 23 15:42:35 np0005532763 ceph-mon[75752]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1418789177' entity='client.rgw.rgw.compute-2.cwocqr' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Nov 23 15:42:35 np0005532763 ceph-mon[75752]: Deploying daemon rgw.rgw.compute-1.exwrda on compute-1
Nov 23 15:42:35 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.102:0/1418789177' entity='client.rgw.rgw.compute-2.cwocqr' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Nov 23 15:42:35 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-2.cwocqr' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Nov 23 15:42:36 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e43 e43: 3 total, 3 up, 3 in
Nov 23 15:42:36 np0005532763 radosgw[84112]: rgw main: failed to create zone with (17) File exists
Nov 23 15:42:36 np0005532763 radosgw[84112]: rgw main: failed to create zonegroup with (17) File exists
Nov 23 15:42:36 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:36 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:36 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:36 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.lntkpb", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 23 15:42:36 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.lntkpb", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 23 15:42:36 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:36 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-2.cwocqr' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Nov 23 15:42:37 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e44 e44: 3 total, 3 up, 3 in
Nov 23 15:42:37 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0)
Nov 23 15:42:37 np0005532763 ceph-mon[75752]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/141380246' entity='client.rgw.rgw.compute-2.cwocqr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Nov 23 15:42:37 np0005532763 ceph-mon[75752]: Deploying daemon rgw.rgw.compute-0.lntkpb on compute-0
Nov 23 15:42:37 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.102:0/141380246' entity='client.rgw.rgw.compute-2.cwocqr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Nov 23 15:42:37 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-2.cwocqr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Nov 23 15:42:37 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.101:0/4191610001' entity='client.rgw.rgw.compute-1.exwrda' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Nov 23 15:42:37 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-1.exwrda' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Nov 23 15:42:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e45 e45: 3 total, 3 up, 3 in
Nov 23 15:42:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e45 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:42:39 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:39 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:39 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:39 np0005532763 ceph-mon[75752]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Nov 23 15:42:39 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-2.cwocqr' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Nov 23 15:42:39 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-1.exwrda' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Nov 23 15:42:39 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:39 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:39 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.utubtn", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Nov 23 15:42:39 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.utubtn", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Nov 23 15:42:39 np0005532763 ceph-mon[75752]: Deploying daemon mds.cephfs.compute-2.utubtn on compute-2
Nov 23 15:42:39 np0005532763 podman[84791]: 2025-11-23 20:42:39.898215339 +0000 UTC m=+0.057755306 container create 520c7a07e80907423b4d6cf64e4f5b356612ba9236a7504d3c44787f1f303954 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_gagarin, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:42:39 np0005532763 systemd[1]: Started libpod-conmon-520c7a07e80907423b4d6cf64e4f5b356612ba9236a7504d3c44787f1f303954.scope.
Nov 23 15:42:39 np0005532763 podman[84791]: 2025-11-23 20:42:39.878832444 +0000 UTC m=+0.038372441 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:42:39 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:42:40 np0005532763 podman[84791]: 2025-11-23 20:42:40.005817844 +0000 UTC m=+0.165357891 container init 520c7a07e80907423b4d6cf64e4f5b356612ba9236a7504d3c44787f1f303954 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_gagarin, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Nov 23 15:42:40 np0005532763 podman[84791]: 2025-11-23 20:42:40.012005131 +0000 UTC m=+0.171545088 container start 520c7a07e80907423b4d6cf64e4f5b356612ba9236a7504d3c44787f1f303954 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_gagarin, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 15:42:40 np0005532763 podman[84791]: 2025-11-23 20:42:40.0158047 +0000 UTC m=+0.175344727 container attach 520c7a07e80907423b4d6cf64e4f5b356612ba9236a7504d3c44787f1f303954 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_gagarin, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS)
Nov 23 15:42:40 np0005532763 flamboyant_gagarin[84807]: 167 167
Nov 23 15:42:40 np0005532763 systemd[1]: libpod-520c7a07e80907423b4d6cf64e4f5b356612ba9236a7504d3c44787f1f303954.scope: Deactivated successfully.
Nov 23 15:42:40 np0005532763 conmon[84807]: conmon 520c7a07e80907423b4d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-520c7a07e80907423b4d6cf64e4f5b356612ba9236a7504d3c44787f1f303954.scope/container/memory.events
Nov 23 15:42:40 np0005532763 podman[84791]: 2025-11-23 20:42:40.018850187 +0000 UTC m=+0.178390174 container died 520c7a07e80907423b4d6cf64e4f5b356612ba9236a7504d3c44787f1f303954 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_gagarin, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Nov 23 15:42:40 np0005532763 systemd[1]: var-lib-containers-storage-overlay-74c9e62af82b0fbd04912904befa1624beb9ecfbd676815e46b283ca92d7ea2d-merged.mount: Deactivated successfully.
Nov 23 15:42:40 np0005532763 podman[84791]: 2025-11-23 20:42:40.055780146 +0000 UTC m=+0.215320143 container remove 520c7a07e80907423b4d6cf64e4f5b356612ba9236a7504d3c44787f1f303954 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=flamboyant_gagarin, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:42:40 np0005532763 systemd[1]: libpod-conmon-520c7a07e80907423b4d6cf64e4f5b356612ba9236a7504d3c44787f1f303954.scope: Deactivated successfully.
Nov 23 15:42:40 np0005532763 systemd[1]: Reloading.
Nov 23 15:42:40 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:42:40 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:42:40 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e46 e46: 3 total, 3 up, 3 in
Nov 23 15:42:40 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0)
Nov 23 15:42:40 np0005532763 ceph-mon[75752]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/141380246' entity='client.rgw.rgw.compute-2.cwocqr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 23 15:42:40 np0005532763 systemd[1]: Reloading.
Nov 23 15:42:40 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:42:40 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:42:40 np0005532763 ceph-mon[75752]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Nov 23 15:42:40 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.102:0/141380246' entity='client.rgw.rgw.compute-2.cwocqr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 23 15:42:40 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/4054506421' entity='client.rgw.rgw.compute-0.lntkpb' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 23 15:42:40 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-2.cwocqr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 23 15:42:40 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-1.exwrda' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 23 15:42:40 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.101:0/4191610001' entity='client.rgw.rgw.compute-1.exwrda' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Nov 23 15:42:40 np0005532763 systemd[1]: Starting Ceph mds.cephfs.compute-2.utubtn for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:42:41 np0005532763 podman[84948]: 2025-11-23 20:42:41.086328926 +0000 UTC m=+0.067992150 container create f4c59361b9ef6b6afaf42c46fc5da61dd3b48e0dfd06d88a483bdff0068b9b0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mds-cephfs-compute-2-utubtn, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 23 15:42:41 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9fdb70e038f679bcf617a44811f88ceb3a99802e93823c44b3a2cce66d07316/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 15:42:41 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9fdb70e038f679bcf617a44811f88ceb3a99802e93823c44b3a2cce66d07316/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:42:41 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9fdb70e038f679bcf617a44811f88ceb3a99802e93823c44b3a2cce66d07316/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 15:42:41 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9fdb70e038f679bcf617a44811f88ceb3a99802e93823c44b3a2cce66d07316/merged/var/lib/ceph/mds/ceph-cephfs.compute-2.utubtn supports timestamps until 2038 (0x7fffffff)
Nov 23 15:42:41 np0005532763 podman[84948]: 2025-11-23 20:42:41.058138738 +0000 UTC m=+0.039802012 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:42:41 np0005532763 podman[84948]: 2025-11-23 20:42:41.16463445 +0000 UTC m=+0.146297734 container init f4c59361b9ef6b6afaf42c46fc5da61dd3b48e0dfd06d88a483bdff0068b9b0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mds-cephfs-compute-2-utubtn, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 23 15:42:41 np0005532763 podman[84948]: 2025-11-23 20:42:41.174669858 +0000 UTC m=+0.156333072 container start f4c59361b9ef6b6afaf42c46fc5da61dd3b48e0dfd06d88a483bdff0068b9b0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mds-cephfs-compute-2-utubtn, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:42:41 np0005532763 bash[84948]: f4c59361b9ef6b6afaf42c46fc5da61dd3b48e0dfd06d88a483bdff0068b9b0f
Nov 23 15:42:41 np0005532763 systemd[1]: Started Ceph mds.cephfs.compute-2.utubtn for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:42:41 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e47 e47: 3 total, 3 up, 3 in
Nov 23 15:42:41 np0005532763 ceph-mds[84968]: set uid:gid to 167:167 (ceph:ceph)
Nov 23 15:42:41 np0005532763 ceph-mds[84968]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Nov 23 15:42:41 np0005532763 ceph-mds[84968]: main not setting numa affinity
Nov 23 15:42:41 np0005532763 ceph-mds[84968]: pidfile_write: ignore empty --pid-file
Nov 23 15:42:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mds-cephfs-compute-2-utubtn[84964]: starting mds.cephfs.compute-2.utubtn at 
Nov 23 15:42:41 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn Updating MDS map to version 2 from mon.1
Nov 23 15:42:42 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/4054506421' entity='client.rgw.rgw.compute-0.lntkpb' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Nov 23 15:42:42 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-2.cwocqr' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Nov 23 15:42:42 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-1.exwrda' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Nov 23 15:42:42 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:42 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:42 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:42 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.jcbopz", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Nov 23 15:42:42 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.jcbopz", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Nov 23 15:42:42 np0005532763 ceph-mon[75752]: Deploying daemon mds.cephfs.compute-0.jcbopz on compute-0
Nov 23 15:42:42 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e48 e48: 3 total, 3 up, 3 in
Nov 23 15:42:42 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0)
Nov 23 15:42:42 np0005532763 ceph-mon[75752]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/141380246' entity='client.rgw.rgw.compute-2.cwocqr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 23 15:42:42 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e3 new map
Nov 23 15:42:42 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e3 print_map
e3
btime 2025-11-23T20:42:42:276651+0000
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name	cephfs
epoch	2
flags	12 joinable allow_snaps allow_multimds_snaps
created	2025-11-23T20:42:15.389822+0000
modified	2025-11-23T20:42:15.389822+0000
tableserver	0
root	0
session_timeout	60
session_autoclose	300
max_file_size	1099511627776
max_xattr_size	65536
required_client_features	{}
last_failure	0
last_failure_osd_epoch	0
compat	compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
max_mds	1
in	
up	{}
failed	
damaged	
stopped	
data_pools	[7]
metadata_pool	6
inline_data	disabled
balancer	
bal_rank_mask	-1
standby_count_wanted	0
qdb_cluster	leader: 0 members: 

Standby daemons:

[mds.cephfs.compute-2.utubtn{-1:24181} state up:standby seq 1 addr [v2:192.168.122.102:6804/3232844591,v1:192.168.122.102:6805/3232844591] compat {c=[1],r=[1],i=[1fff]}]
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn Updating MDS map to version 3 from mon.1
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn Monitors have assigned me to become a standby
Nov 23 15:42:42 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e4 new map
Nov 23 15:42:42 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e4 print_map
e4
btime 2025-11-23T20:42:42:291982+0000
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name	cephfs
epoch	4
flags	12 joinable allow_snaps allow_multimds_snaps
created	2025-11-23T20:42:15.389822+0000
modified	2025-11-23T20:42:42.291972+0000
tableserver	0
root	0
session_timeout	60
session_autoclose	300
max_file_size	1099511627776
max_xattr_size	65536
required_client_features	{}
last_failure	0
last_failure_osd_epoch	0
compat	compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
max_mds	1
in	0
up	{0=24181}
failed	
damaged	
stopped	
data_pools	[7]
metadata_pool	6
inline_data	disabled
balancer	
bal_rank_mask	-1
standby_count_wanted	0
qdb_cluster	leader: 0 members: 
[mds.cephfs.compute-2.utubtn{0:24181} state up:creating seq 1 addr [v2:192.168.122.102:6804/3232844591,v1:192.168.122.102:6805/3232844591] compat {c=[1],r=[1],i=[1fff]}]
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn Updating MDS map to version 4 from mon.1
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.0.4 handle_mds_map I am now mds.0.4
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.0.cache creating system inode with ino:0x1
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.0.cache creating system inode with ino:0x100
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.0.cache creating system inode with ino:0x600
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.0.cache creating system inode with ino:0x601
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.0.cache creating system inode with ino:0x602
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.0.cache creating system inode with ino:0x603
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.0.cache creating system inode with ino:0x604
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.0.cache creating system inode with ino:0x605
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.0.cache creating system inode with ino:0x606
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.0.cache creating system inode with ino:0x607
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.0.cache creating system inode with ino:0x608
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.0.cache creating system inode with ino:0x609
Nov 23 15:42:42 np0005532763 ceph-mds[84968]: mds.0.4 creating_done
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/4054506421' entity='client.rgw.rgw.compute-0.lntkpb' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-1.exwrda' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.101:0/4191610001' entity='client.rgw.rgw.compute-1.exwrda' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.102:0/141380246' entity='client.rgw.rgw.compute-2.cwocqr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: daemon mds.cephfs.compute-2.utubtn assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-2.cwocqr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: daemon mds.cephfs.compute-2.utubtn is now active in filesystem cephfs as rank 0
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.gmfhnm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.gmfhnm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e49 e49: 3 total, 3 up, 3 in
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e5 new map
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e5 print_map
e5
btime 2025-11-23T20:42:43:300630+0000
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name	cephfs
epoch	5
flags	12 joinable allow_snaps allow_multimds_snaps
created	2025-11-23T20:42:15.389822+0000
modified	2025-11-23T20:42:43.300628+0000
tableserver	0
root	0
session_timeout	60
session_autoclose	300
max_file_size	1099511627776
max_xattr_size	65536
required_client_features	{}
last_failure	0
last_failure_osd_epoch	0
compat	compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
max_mds	1
in	0
up	{0=24181}
failed	
damaged	
stopped	
data_pools	[7]
metadata_pool	6
inline_data	disabled
balancer	
bal_rank_mask	-1
standby_count_wanted	0
qdb_cluster	leader: 24181 members: 24181
[mds.cephfs.compute-2.utubtn{0:24181} state up:active seq 2 addr [v2:192.168.122.102:6804/3232844591,v1:192.168.122.102:6805/3232844591] compat {c=[1],r=[1],i=[1fff]}]

Standby daemons:

[mds.cephfs.compute-0.jcbopz{-1:14580} state up:standby seq 1 addr [v2:192.168.122.100:6806/3257423559,v1:192.168.122.100:6807/3257423559] compat {c=[1],r=[1],i=[1fff]}]
Nov 23 15:42:43 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn Updating MDS map to version 5 from mon.1
Nov 23 15:42:43 np0005532763 ceph-mds[84968]: mds.0.4 handle_mds_map I am now mds.0.4
Nov 23 15:42:43 np0005532763 ceph-mds[84968]: mds.0.4 handle_mds_map state change up:creating --> up:active
Nov 23 15:42:43 np0005532763 ceph-mds[84968]: mds.0.4 recovery_done -- successful recovery!
Nov 23 15:42:43 np0005532763 ceph-mds[84968]: mds.0.4 active_start
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0)
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/141380246' entity='client.rgw.rgw.compute-2.cwocqr' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e6 new map
Nov 23 15:42:43 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e6 print_map#012e6#012btime 2025-11-23T20:42:43:320643+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-23T20:42:15.389822+0000#012modified#0112025-11-23T20:42:43.300628+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24181}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24181 members: 24181#012[mds.cephfs.compute-2.utubtn{0:24181} state up:active seq 2 addr [v2:192.168.122.102:6804/3232844591,v1:192.168.122.102:6805/3232844591] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jcbopz{-1:14580} state up:standby seq 1 addr [v2:192.168.122.100:6806/3257423559,v1:192.168.122.100:6807/3257423559] compat {c=[1],r=[1],i=[1fff]}]
Nov 23 15:42:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e50 e50: 3 total, 3 up, 3 in
Nov 23 15:42:44 np0005532763 ceph-mon[75752]: Deploying daemon mds.cephfs.compute-1.gmfhnm on compute-1
Nov 23 15:42:44 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/4054506421' entity='client.rgw.rgw.compute-0.lntkpb' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Nov 23 15:42:44 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-1.exwrda' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Nov 23 15:42:44 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-2.cwocqr' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Nov 23 15:42:44 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/4054506421' entity='client.rgw.rgw.compute-0.lntkpb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 23 15:42:44 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.101:0/4191610001' entity='client.rgw.rgw.compute-1.exwrda' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 23 15:42:44 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.102:0/141380246' entity='client.rgw.rgw.compute-2.cwocqr' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 23 15:42:44 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-2.cwocqr' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 23 15:42:44 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-1.exwrda' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Nov 23 15:42:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e50 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:42:44 np0005532763 radosgw[84112]: v1 topic migration: starting v1 topic migration..
Nov 23 15:42:44 np0005532763 radosgw[84112]: LDAP not started since no server URIs were provided in the configuration.
Nov 23 15:42:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-rgw-rgw-compute-2-cwocqr[84108]: 2025-11-23T20:42:44.568+0000 7ff30fb2e980 -1 LDAP not started since no server URIs were provided in the configuration.
Nov 23 15:42:44 np0005532763 radosgw[84112]: v1 topic migration: finished v1 topic migration
Nov 23 15:42:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Nov 23 15:42:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Nov 23 15:42:44 np0005532763 radosgw[84112]: framework: beast
Nov 23 15:42:44 np0005532763 radosgw[84112]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Nov 23 15:42:44 np0005532763 radosgw[84112]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Nov 23 15:42:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Nov 23 15:42:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Nov 23 15:42:44 np0005532763 radosgw[84112]: starting handler: beast
Nov 23 15:42:44 np0005532763 radosgw[84112]: set uid:gid to 167:167 (ceph:ceph)
Nov 23 15:42:44 np0005532763 radosgw[84112]: mgrc service_daemon_register rgw.24175 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.102:8082,frontend_type#0=beast,hostname=compute-2,id=rgw.compute-2.cwocqr,kernel_description=#1 SMP PREEMPT_DYNAMIC Sat Nov 15 10:30:41 UTC 2025,kernel_version=5.14.0-639.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864320,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=7b74c4d0-333d-4a78-943d-fd3c4abdfa87,zone_name=default,zonegroup_id=3560ca63-18fc-44aa-8d4c-f5d89c554a9f,zonegroup_name=default}
Nov 23 15:42:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Nov 23 15:42:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Nov 23 15:42:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Nov 23 15:42:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Nov 23 15:42:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Nov 23 15:42:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: from='client.? 192.168.122.100:0/4054506421' entity='client.rgw.rgw.compute-0.lntkpb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-2.cwocqr' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: from='client.? ' entity='client.rgw.rgw.compute-1.exwrda' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: Creating key for client.nfs.cephfs.0.0.compute-1.fuxuha
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.fuxuha", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.fuxuha", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.fuxuha-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.fuxuha-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e7 new map
Nov 23 15:42:45 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e7 print_map#012e7#012btime 2025-11-23T20:42:45:339402+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-23T20:42:15.389822+0000#012modified#0112025-11-23T20:42:43.300628+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24181}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24181 members: 24181#012[mds.cephfs.compute-2.utubtn{0:24181} state up:active seq 2 addr [v2:192.168.122.102:6804/3232844591,v1:192.168.122.102:6805/3232844591] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jcbopz{-1:14580} state up:standby seq 1 addr [v2:192.168.122.100:6806/3257423559,v1:192.168.122.100:6807/3257423559] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.gmfhnm{-1:24284} state up:standby seq 1 addr [v2:192.168.122.101:6804/3633651935,v1:192.168.122.101:6805/3633651935] compat {c=[1],r=[1],i=[1fff]}]
Nov 23 15:42:46 np0005532763 ceph-mon[75752]: Rados config object exists: conf-nfs.cephfs
Nov 23 15:42:46 np0005532763 ceph-mon[75752]: Creating key for client.nfs.cephfs.0.0.compute-1.fuxuha-rgw
Nov 23 15:42:46 np0005532763 ceph-mon[75752]: Bind address in nfs.cephfs.0.0.compute-1.fuxuha's ganesha conf is defaulting to empty
Nov 23 15:42:46 np0005532763 ceph-mon[75752]: Deploying daemon nfs.cephfs.0.0.compute-1.fuxuha on compute-1
Nov 23 15:42:46 np0005532763 ceph-mon[75752]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Nov 23 15:42:46 np0005532763 ceph-mon[75752]: Cluster is now healthy
Nov 23 15:42:46 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:46 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e8 new map
Nov 23 15:42:46 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e8 print_map#012e8#012btime 2025-11-23T20:42:46:698669+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-23T20:42:15.389822+0000#012modified#0112025-11-23T20:42:46.341150+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24181}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24181 members: 24181#012[mds.cephfs.compute-2.utubtn{0:24181} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/3232844591,v1:192.168.122.102:6805/3232844591] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jcbopz{-1:14580} state up:standby seq 1 addr [v2:192.168.122.100:6806/3257423559,v1:192.168.122.100:6807/3257423559] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.gmfhnm{-1:24284} state up:standby seq 1 addr [v2:192.168.122.101:6804/3633651935,v1:192.168.122.101:6805/3633651935] compat {c=[1],r=[1],i=[1fff]}]
Nov 23 15:42:46 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn Updating MDS map to version 8 from mon.1
Nov 23 15:42:47 np0005532763 ceph-mds[84968]: mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Nov 23 15:42:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mds-cephfs-compute-2-utubtn[84964]: 2025-11-23T20:42:47.322+0000 7fa8cd19f640 -1 mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Nov 23 15:42:47 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:47 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:47 np0005532763 ceph-mon[75752]: Creating key for client.nfs.cephfs.1.0.compute-2.dqbktw
Nov 23 15:42:47 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.dqbktw", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Nov 23 15:42:47 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.dqbktw", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Nov 23 15:42:47 np0005532763 ceph-mon[75752]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Nov 23 15:42:47 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Nov 23 15:42:47 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Nov 23 15:42:47 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e9 new map
Nov 23 15:42:47 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e9 print_map#012e9#012btime 2025-11-23T20:42:47:710992+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-23T20:42:15.389822+0000#012modified#0112025-11-23T20:42:46.341150+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24181}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24181 members: 24181#012[mds.cephfs.compute-2.utubtn{0:24181} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/3232844591,v1:192.168.122.102:6805/3232844591] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jcbopz{-1:14580} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/3257423559,v1:192.168.122.100:6807/3257423559] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.gmfhnm{-1:24284} state up:standby seq 1 addr [v2:192.168.122.101:6804/3633651935,v1:192.168.122.101:6805/3633651935] compat {c=[1],r=[1],i=[1fff]}]
Nov 23 15:42:48 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e10 new map
Nov 23 15:42:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).mds e10 print_map#012e10#012btime 2025-11-23T20:42:49:046556+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-11-23T20:42:15.389822+0000#012modified#0112025-11-23T20:42:46.341150+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24181}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24181 members: 24181#012[mds.cephfs.compute-2.utubtn{0:24181} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/3232844591,v1:192.168.122.102:6805/3232844591] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.jcbopz{-1:14580} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/3257423559,v1:192.168.122.100:6807/3257423559] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.gmfhnm{-1:24284} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/3633651935,v1:192.168.122.101:6805/3633651935] compat {c=[1],r=[1],i=[1fff]}]
Nov 23 15:42:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e50 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:42:50 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Nov 23 15:42:50 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Nov 23 15:42:50 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.dqbktw-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 23 15:42:50 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.dqbktw-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 23 15:42:50 np0005532763 podman[85125]: 2025-11-23 20:42:50.439084663 +0000 UTC m=+0.067089714 container create 0db0defa589850ba6d9661bfb145c8a65f1a7b7985af7c144a7183abe64a8975 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_swirles, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Nov 23 15:42:50 np0005532763 systemd[1]: Started libpod-conmon-0db0defa589850ba6d9661bfb145c8a65f1a7b7985af7c144a7183abe64a8975.scope.
Nov 23 15:42:50 np0005532763 podman[85125]: 2025-11-23 20:42:50.409597928 +0000 UTC m=+0.037603019 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:42:50 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:42:50 np0005532763 podman[85125]: 2025-11-23 20:42:50.58583456 +0000 UTC m=+0.213839621 container init 0db0defa589850ba6d9661bfb145c8a65f1a7b7985af7c144a7183abe64a8975 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_swirles, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 15:42:50 np0005532763 podman[85125]: 2025-11-23 20:42:50.593415987 +0000 UTC m=+0.221421028 container start 0db0defa589850ba6d9661bfb145c8a65f1a7b7985af7c144a7183abe64a8975 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_swirles, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 23 15:42:50 np0005532763 podman[85125]: 2025-11-23 20:42:50.596787864 +0000 UTC m=+0.224792955 container attach 0db0defa589850ba6d9661bfb145c8a65f1a7b7985af7c144a7183abe64a8975 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_swirles, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Nov 23 15:42:50 np0005532763 vigilant_swirles[85142]: 167 167
Nov 23 15:42:50 np0005532763 systemd[1]: libpod-0db0defa589850ba6d9661bfb145c8a65f1a7b7985af7c144a7183abe64a8975.scope: Deactivated successfully.
Nov 23 15:42:50 np0005532763 podman[85125]: 2025-11-23 20:42:50.599876402 +0000 UTC m=+0.227881443 container died 0db0defa589850ba6d9661bfb145c8a65f1a7b7985af7c144a7183abe64a8975 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_swirles, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Nov 23 15:42:50 np0005532763 systemd[1]: var-lib-containers-storage-overlay-5db6edf58e7385746afe12600698d1137ec3407bff89a3b25fd9ce9f32b4a603-merged.mount: Deactivated successfully.
Nov 23 15:42:50 np0005532763 podman[85125]: 2025-11-23 20:42:50.651325827 +0000 UTC m=+0.279330838 container remove 0db0defa589850ba6d9661bfb145c8a65f1a7b7985af7c144a7183abe64a8975 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_swirles, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 23 15:42:50 np0005532763 systemd[1]: libpod-conmon-0db0defa589850ba6d9661bfb145c8a65f1a7b7985af7c144a7183abe64a8975.scope: Deactivated successfully.
Nov 23 15:42:50 np0005532763 systemd[1]: Reloading.
Nov 23 15:42:50 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:42:50 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:42:51 np0005532763 systemd[1]: Reloading.
Nov 23 15:42:51 np0005532763 ceph-mon[75752]: Rados config object exists: conf-nfs.cephfs
Nov 23 15:42:51 np0005532763 ceph-mon[75752]: Creating key for client.nfs.cephfs.1.0.compute-2.dqbktw-rgw
Nov 23 15:42:51 np0005532763 ceph-mon[75752]: Bind address in nfs.cephfs.1.0.compute-2.dqbktw's ganesha conf is defaulting to empty
Nov 23 15:42:51 np0005532763 ceph-mon[75752]: Deploying daemon nfs.cephfs.1.0.compute-2.dqbktw on compute-2
Nov 23 15:42:51 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:42:51 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:42:51 np0005532763 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:42:51 np0005532763 podman[85282]: 2025-11-23 20:42:51.638454952 +0000 UTC m=+0.066549938 container create bed36d32aed75483423ddad867bd3597890ddea7438fb907c2da911b493cc00a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Nov 23 15:42:51 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50f777721e929df053d35c3f357207cad5e44090b2a32521c900a1ca0dec6f96/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Nov 23 15:42:51 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50f777721e929df053d35c3f357207cad5e44090b2a32521c900a1ca0dec6f96/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:42:51 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50f777721e929df053d35c3f357207cad5e44090b2a32521c900a1ca0dec6f96/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:42:51 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50f777721e929df053d35c3f357207cad5e44090b2a32521c900a1ca0dec6f96/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.dqbktw-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:42:51 np0005532763 podman[85282]: 2025-11-23 20:42:51.610574373 +0000 UTC m=+0.038669399 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:42:51 np0005532763 podman[85282]: 2025-11-23 20:42:51.722336247 +0000 UTC m=+0.150431263 container init bed36d32aed75483423ddad867bd3597890ddea7438fb907c2da911b493cc00a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 23 15:42:51 np0005532763 podman[85282]: 2025-11-23 20:42:51.739025875 +0000 UTC m=+0.167120861 container start bed36d32aed75483423ddad867bd3597890ddea7438fb907c2da911b493cc00a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 15:42:51 np0005532763 bash[85282]: bed36d32aed75483423ddad867bd3597890ddea7438fb907c2da911b493cc00a
Nov 23 15:42:51 np0005532763 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:42:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:51 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Nov 23 15:42:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:51 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Nov 23 15:42:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:51 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Nov 23 15:42:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:51 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Nov 23 15:42:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:51 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Nov 23 15:42:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:51 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Nov 23 15:42:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:51 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Nov 23 15:42:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:51 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:42:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:51 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Nov 23 15:42:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:51 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Nov 23 15:42:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:51 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:42:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:51 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:42:52 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:52 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:52 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:52 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.bfglcy", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Nov 23 15:42:52 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.bfglcy", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Nov 23 15:42:52 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Nov 23 15:42:52 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Nov 23 15:42:53 np0005532763 ceph-mon[75752]: Creating key for client.nfs.cephfs.2.0.compute-0.bfglcy
Nov 23 15:42:53 np0005532763 ceph-mon[75752]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Nov 23 15:42:53 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e50 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:42:55 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Nov 23 15:42:55 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Nov 23 15:42:55 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.bfglcy-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Nov 23 15:42:55 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.bfglcy-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Nov 23 15:42:56 np0005532763 ceph-mon[75752]: Rados config object exists: conf-nfs.cephfs
Nov 23 15:42:56 np0005532763 ceph-mon[75752]: Creating key for client.nfs.cephfs.2.0.compute-0.bfglcy-rgw
Nov 23 15:42:56 np0005532763 ceph-mon[75752]: Bind address in nfs.cephfs.2.0.compute-0.bfglcy's ganesha conf is defaulting to empty
Nov 23 15:42:56 np0005532763 ceph-mon[75752]: Deploying daemon nfs.cephfs.2.0.compute-0.bfglcy on compute-0
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000002:nfs.cephfs.1: -2
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:42:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:42:57 : epoch 692371cb : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:42:58 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:58 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:58 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:58 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:58 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:58 np0005532763 ceph-mon[75752]: Deploying daemon haproxy.nfs.cephfs.compute-1.iwomei on compute-1
Nov 23 15:42:59 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:42:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e50 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:43:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:02 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd40000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:02 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:02 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:02 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:03 np0005532763 ceph-mon[75752]: Deploying daemon haproxy.nfs.cephfs.compute-0.uvukit on compute-0
Nov 23 15:43:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:04 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd2c001970 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e50 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:43:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:05 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd2c001970 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:06 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd10000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:06 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:06 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:06 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:07 np0005532763 ceph-mon[75752]: Deploying daemon haproxy.nfs.cephfs.compute-2.dxqoem on compute-2
Nov 23 15:43:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:07 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd1c000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:08 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd14000d00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:09 np0005532763 podman[85447]: 2025-11-23 20:43:09.077718931 +0000 UTC m=+2.842896400 container create 87f01f070ec81c1a171da634349452a48ef09c3925bcd895a376050b1acd0b46 (image=quay.io/ceph/haproxy:2.3, name=relaxed_proskuriakova)
Nov 23 15:43:09 np0005532763 podman[85447]: 2025-11-23 20:43:09.052557029 +0000 UTC m=+2.817734568 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Nov 23 15:43:09 np0005532763 systemd[1]: Started libpod-conmon-87f01f070ec81c1a171da634349452a48ef09c3925bcd895a376050b1acd0b46.scope.
Nov 23 15:43:09 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:43:09 np0005532763 podman[85447]: 2025-11-23 20:43:09.187560419 +0000 UTC m=+2.952737968 container init 87f01f070ec81c1a171da634349452a48ef09c3925bcd895a376050b1acd0b46 (image=quay.io/ceph/haproxy:2.3, name=relaxed_proskuriakova)
Nov 23 15:43:09 np0005532763 podman[85447]: 2025-11-23 20:43:09.201778687 +0000 UTC m=+2.966956176 container start 87f01f070ec81c1a171da634349452a48ef09c3925bcd895a376050b1acd0b46 (image=quay.io/ceph/haproxy:2.3, name=relaxed_proskuriakova)
Nov 23 15:43:09 np0005532763 podman[85447]: 2025-11-23 20:43:09.205441422 +0000 UTC m=+2.970618941 container attach 87f01f070ec81c1a171da634349452a48ef09c3925bcd895a376050b1acd0b46 (image=quay.io/ceph/haproxy:2.3, name=relaxed_proskuriakova)
Nov 23 15:43:09 np0005532763 relaxed_proskuriakova[85562]: 0 0
Nov 23 15:43:09 np0005532763 systemd[1]: libpod-87f01f070ec81c1a171da634349452a48ef09c3925bcd895a376050b1acd0b46.scope: Deactivated successfully.
Nov 23 15:43:09 np0005532763 podman[85447]: 2025-11-23 20:43:09.21166663 +0000 UTC m=+2.976844119 container died 87f01f070ec81c1a171da634349452a48ef09c3925bcd895a376050b1acd0b46 (image=quay.io/ceph/haproxy:2.3, name=relaxed_proskuriakova)
Nov 23 15:43:09 np0005532763 systemd[1]: var-lib-containers-storage-overlay-74a10ab94e65f2fac4daa4895607e30099ea6d055791863e5b2d8efa8f65de38-merged.mount: Deactivated successfully.
Nov 23 15:43:09 np0005532763 podman[85447]: 2025-11-23 20:43:09.262914949 +0000 UTC m=+3.028092438 container remove 87f01f070ec81c1a171da634349452a48ef09c3925bcd895a376050b1acd0b46 (image=quay.io/ceph/haproxy:2.3, name=relaxed_proskuriakova)
Nov 23 15:43:09 np0005532763 systemd[1]: libpod-conmon-87f01f070ec81c1a171da634349452a48ef09c3925bcd895a376050b1acd0b46.scope: Deactivated successfully.
Nov 23 15:43:09 np0005532763 systemd[1]: Reloading.
Nov 23 15:43:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e50 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:43:09 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:43:09 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:43:09 np0005532763 systemd[1]: Reloading.
Nov 23 15:43:09 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:43:09 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:43:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:09 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd2c002860 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:09 np0005532763 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-2.dxqoem for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:43:10 np0005532763 podman[85711]: 2025-11-23 20:43:10.342300449 +0000 UTC m=+0.066492767 container create 187afc4c1e67339be091cc4caff41c0e2aaba4673fc086f757180d516596ee6c (image=quay.io/ceph/haproxy:2.3, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem)
Nov 23 15:43:10 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03d06609f98c3ed7796c11da4c5aac035beafbe8f1301ee47c5ad0e034e87efe/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Nov 23 15:43:10 np0005532763 podman[85711]: 2025-11-23 20:43:10.315597694 +0000 UTC m=+0.039790042 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Nov 23 15:43:10 np0005532763 podman[85711]: 2025-11-23 20:43:10.409837775 +0000 UTC m=+0.134030143 container init 187afc4c1e67339be091cc4caff41c0e2aaba4673fc086f757180d516596ee6c (image=quay.io/ceph/haproxy:2.3, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem)
Nov 23 15:43:10 np0005532763 podman[85711]: 2025-11-23 20:43:10.414790777 +0000 UTC m=+0.138983095 container start 187afc4c1e67339be091cc4caff41c0e2aaba4673fc086f757180d516596ee6c (image=quay.io/ceph/haproxy:2.3, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem)
Nov 23 15:43:10 np0005532763 bash[85711]: 187afc4c1e67339be091cc4caff41c0e2aaba4673fc086f757180d516596ee6c
Nov 23 15:43:10 np0005532763 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-2.dxqoem for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:43:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [NOTICE] 326/204310 (2) : New worker #1 (4) forked
Nov 23 15:43:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:10 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd100016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:11 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd1c001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:11 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:11 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:11 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:11 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:11 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd14001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:12 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd2c002860 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:12 np0005532763 ceph-mon[75752]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Nov 23 15:43:12 np0005532763 ceph-mon[75752]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Nov 23 15:43:12 np0005532763 ceph-mon[75752]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Nov 23 15:43:12 np0005532763 ceph-mon[75752]: Deploying daemon keepalived.nfs.cephfs.compute-1.lwmzxc on compute-1
Nov 23 15:43:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:13 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd100016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:13 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]: dispatch
Nov 23 15:43:13 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e51 e51: 3 total, 3 up, 3 in
Nov 23 15:43:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:13 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd1c001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:43:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:14 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd14001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:14 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]': finished
Nov 23 15:43:14 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Nov 23 15:43:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e52 e52: 3 total, 3 up, 3 in
Nov 23 15:43:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:15 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd2c002860 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e53 e53: 3 total, 3 up, 3 in
Nov 23 15:43:15 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Nov 23 15:43:15 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Nov 23 15:43:15 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 23 15:43:15 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Nov 23 15:43:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:15 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd100016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:16 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd1c001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:16 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e54 e54: 3 total, 3 up, 3 in
Nov 23 15:43:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Nov 23 15:43:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Nov 23 15:43:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Nov 23 15:43:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Nov 23 15:43:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:16 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:16 np0005532763 ceph-mon[75752]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Nov 23 15:43:16 np0005532763 ceph-mon[75752]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Nov 23 15:43:16 np0005532763 ceph-mon[75752]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Nov 23 15:43:16 np0005532763 ceph-mon[75752]: Deploying daemon keepalived.nfs.cephfs.compute-0.spcytb on compute-0
Nov 23 15:43:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:17 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd14001820 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:17 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e55 e55: 3 total, 3 up, 3 in
Nov 23 15:43:17 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Nov 23 15:43:17 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Nov 23 15:43:17 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 23 15:43:17 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 23 15:43:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:17 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd2c002860 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:18 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd2c002860 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:18 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e56 e56: 3 total, 3 up, 3 in
Nov 23 15:43:18 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Nov 23 15:43:18 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Nov 23 15:43:18 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Nov 23 15:43:18 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Nov 23 15:43:18 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Nov 23 15:43:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:19 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd1c002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e56 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:43:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:19 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd1c002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:20 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e57 e57: 3 total, 3 up, 3 in
Nov 23 15:43:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:20 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd1c002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:20 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Nov 23 15:43:20 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 23 15:43:20 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 23 15:43:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:21 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd2c002860 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:21 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e58 e58: 3 total, 3 up, 3 in
Nov 23 15:43:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:21 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd14002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:21 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Nov 23 15:43:21 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Nov 23 15:43:21 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Nov 23 15:43:21 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Nov 23 15:43:21 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:21 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:21 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:21 np0005532763 ceph-mon[75752]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Nov 23 15:43:21 np0005532763 ceph-mon[75752]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Nov 23 15:43:21 np0005532763 ceph-mon[75752]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Nov 23 15:43:21 np0005532763 ceph-mon[75752]: Deploying daemon keepalived.nfs.cephfs.compute-2.cpybdt on compute-2
Nov 23 15:43:21 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Nov 23 15:43:22 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e59 e59: 3 total, 3 up, 3 in
Nov 23 15:43:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:22 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd10002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:23 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd1c002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:23 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd14002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:24 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e59 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:43:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:24 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd2c002860 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:24 np0005532763 podman[85832]: 2025-11-23 20:43:24.773040101 +0000 UTC m=+3.104434397 container create 4afa32954ba02a7a2f322ac3df10bdc75314a0a3a33bd6e23b12b1b0daddf64d (image=quay.io/ceph/keepalived:2.2.4, name=infallible_curran, description=keepalived for Ceph, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, build-date=2023-02-22T09:23:20, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, name=keepalived, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9)
Nov 23 15:43:24 np0005532763 podman[85832]: 2025-11-23 20:43:24.749760674 +0000 UTC m=+3.081155020 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Nov 23 15:43:24 np0005532763 systemd[1]: Started libpod-conmon-4afa32954ba02a7a2f322ac3df10bdc75314a0a3a33bd6e23b12b1b0daddf64d.scope.
Nov 23 15:43:24 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:43:24 np0005532763 podman[85832]: 2025-11-23 20:43:24.895360057 +0000 UTC m=+3.226754413 container init 4afa32954ba02a7a2f322ac3df10bdc75314a0a3a33bd6e23b12b1b0daddf64d (image=quay.io/ceph/keepalived:2.2.4, name=infallible_curran, vcs-type=git, architecture=x86_64, name=keepalived, description=keepalived for Ceph, io.buildah.version=1.28.2, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, release=1793, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Nov 23 15:43:24 np0005532763 podman[85832]: 2025-11-23 20:43:24.908672149 +0000 UTC m=+3.240066445 container start 4afa32954ba02a7a2f322ac3df10bdc75314a0a3a33bd6e23b12b1b0daddf64d (image=quay.io/ceph/keepalived:2.2.4, name=infallible_curran, vcs-type=git, name=keepalived, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, vendor=Red Hat, Inc., distribution-scope=public)
Nov 23 15:43:24 np0005532763 podman[85832]: 2025-11-23 20:43:24.912808487 +0000 UTC m=+3.244202823 container attach 4afa32954ba02a7a2f322ac3df10bdc75314a0a3a33bd6e23b12b1b0daddf64d (image=quay.io/ceph/keepalived:2.2.4, name=infallible_curran, release=1793, io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, com.redhat.component=keepalived-container, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, vcs-type=git, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Nov 23 15:43:24 np0005532763 infallible_curran[85927]: 0 0
Nov 23 15:43:24 np0005532763 systemd[1]: libpod-4afa32954ba02a7a2f322ac3df10bdc75314a0a3a33bd6e23b12b1b0daddf64d.scope: Deactivated successfully.
Nov 23 15:43:24 np0005532763 podman[85832]: 2025-11-23 20:43:24.92058472 +0000 UTC m=+3.251979006 container died 4afa32954ba02a7a2f322ac3df10bdc75314a0a3a33bd6e23b12b1b0daddf64d (image=quay.io/ceph/keepalived:2.2.4, name=infallible_curran, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, vendor=Red Hat, Inc., name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, com.redhat.component=keepalived-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793)
Nov 23 15:43:24 np0005532763 systemd[1]: var-lib-containers-storage-overlay-b2fe9176324ab76d6b04d718e2372bc01b40ee8a6c5bd079c132a81c218914f7-merged.mount: Deactivated successfully.
Nov 23 15:43:24 np0005532763 podman[85832]: 2025-11-23 20:43:24.971747227 +0000 UTC m=+3.303141513 container remove 4afa32954ba02a7a2f322ac3df10bdc75314a0a3a33bd6e23b12b1b0daddf64d (image=quay.io/ceph/keepalived:2.2.4, name=infallible_curran, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64)
Nov 23 15:43:24 np0005532763 systemd[1]: libpod-conmon-4afa32954ba02a7a2f322ac3df10bdc75314a0a3a33bd6e23b12b1b0daddf64d.scope: Deactivated successfully.
Nov 23 15:43:25 np0005532763 systemd[1]: Reloading.
Nov 23 15:43:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:25 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd10003430 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:25 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:43:25 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:43:25 np0005532763 systemd[1]: Reloading.
Nov 23 15:43:25 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:43:25 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:43:25 np0005532763 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-2.cpybdt for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:43:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:25 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd1c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:26 np0005532763 podman[86072]: 2025-11-23 20:43:26.10536825 +0000 UTC m=+0.082707492 container create f83166e24f35928d8e85c6352ec69e598c685dd22eb2d34bc93aec691f658844 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, version=2.2.4, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., description=keepalived for Ceph, build-date=2023-02-22T09:23:20, vcs-type=git, distribution-scope=public)
Nov 23 15:43:26 np0005532763 podman[86072]: 2025-11-23 20:43:26.073174887 +0000 UTC m=+0.050514199 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Nov 23 15:43:26 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e561d0e7779ae12cb72474b473f6e6c807f04f3a6041f6e7fb78e5330a170e4/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:43:26 np0005532763 podman[86072]: 2025-11-23 20:43:26.18875717 +0000 UTC m=+0.166096472 container init f83166e24f35928d8e85c6352ec69e598c685dd22eb2d34bc93aec691f658844 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, architecture=x86_64, version=2.2.4, vendor=Red Hat, Inc., io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, vcs-type=git, name=keepalived)
Nov 23 15:43:26 np0005532763 podman[86072]: 2025-11-23 20:43:26.19956194 +0000 UTC m=+0.176901192 container start f83166e24f35928d8e85c6352ec69e598c685dd22eb2d34bc93aec691f658844 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt, release=1793, architecture=x86_64, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, name=keepalived, version=2.2.4, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Nov 23 15:43:26 np0005532763 bash[86072]: f83166e24f35928d8e85c6352ec69e598c685dd22eb2d34bc93aec691f658844
Nov 23 15:43:26 np0005532763 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-2.cpybdt for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:43:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:26 2025: Starting Keepalived v2.2.4 (08/21,2021)
Nov 23 15:43:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:26 2025: Running on Linux 5.14.0-639.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Sat Nov 15 10:30:41 UTC 2025 (built for Linux 5.14.0)
Nov 23 15:43:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:26 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Nov 23 15:43:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:26 2025: Configuration file /etc/keepalived/keepalived.conf
Nov 23 15:43:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:26 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Nov 23 15:43:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:26 2025: Starting VRRP child process, pid=4
Nov 23 15:43:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:26 2025: Startup complete
Nov 23 15:43:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:26 2025: (VI_0) Entering BACKUP STATE (init)
Nov 23 15:43:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:26 2025: VRRP_Script(check_backend) succeeded
Nov 23 15:43:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:26 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd14002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:26 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:26 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:26 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:26 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:26 np0005532763 ceph-mon[75752]: Deploying daemon alertmanager.compute-0 on compute-0
Nov 23 15:43:26 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 23 15:43:26 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 23 15:43:26 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 23 15:43:26 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]: dispatch
Nov 23 15:43:26 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 23 15:43:26 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Nov 23 15:43:26 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 23 15:43:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:27 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd2c002860 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:27 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e60 e60: 3 total, 3 up, 3 in
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[11.16( empty local-lis/les=0/0 n=0 ec=57/46 lis/c=57/57 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[11.17( empty local-lis/les=0/0 n=0 ec=57/46 lis/c=57/57 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[8.15( empty local-lis/les=0/0 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[8.3( empty local-lis/les=0/0 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[6.1( empty local-lis/les=0/0 n=0 ec=53/18 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[6.d( empty local-lis/les=0/0 n=0 ec=53/18 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[8.f( empty local-lis/les=0/0 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[11.13( empty local-lis/les=0/0 n=0 ec=57/46 lis/c=57/57 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[11.a( empty local-lis/les=0/0 n=0 ec=57/46 lis/c=57/57 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[9.9( empty local-lis/les=0/0 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[9.8( empty local-lis/les=0/0 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[8.a( empty local-lis/les=0/0 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[8.9( empty local-lis/les=0/0 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[6.7( empty local-lis/les=0/0 n=0 ec=53/18 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[9.b( empty local-lis/les=0/0 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[11.e( empty local-lis/les=0/0 n=0 ec=57/46 lis/c=57/57 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[6.3( empty local-lis/les=0/0 n=0 ec=53/18 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[8.d( empty local-lis/les=0/0 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[11.8( empty local-lis/les=0/0 n=0 ec=57/46 lis/c=57/57 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[6.5( empty local-lis/les=0/0 n=0 ec=53/18 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[8.c( empty local-lis/les=0/0 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[8.b( empty local-lis/les=0/0 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[6.f( empty local-lis/les=0/0 n=0 ec=53/18 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[6.9( empty local-lis/les=0/0 n=0 ec=53/18 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[11.3( empty local-lis/les=0/0 n=0 ec=57/46 lis/c=57/57 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[10.15( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[11.19( empty local-lis/les=0/0 n=0 ec=57/46 lis/c=57/57 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[12.11( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[9.5( empty local-lis/les=0/0 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[10.17( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[9.18( empty local-lis/les=0/0 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[9.1d( empty local-lis/les=0/0 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[12.13( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[8.1c( empty local-lis/les=0/0 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[8.6( empty local-lis/les=0/0 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[9.7( empty local-lis/les=0/0 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[9.13( empty local-lis/les=0/0 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[6.b( empty local-lis/les=0/0 n=0 ec=53/18 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[9.3( empty local-lis/les=0/0 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[12.4( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[8.2( empty local-lis/les=0/0 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[10.13( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[8.5( empty local-lis/les=0/0 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[10.f( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[12.9( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[10.d( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[8.16( empty local-lis/les=0/0 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[7.5( empty local-lis/les=0/0 n=0 ec=53/20 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[10.3( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[10.b( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[8.11( empty local-lis/les=0/0 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[12.2( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[9.17( empty local-lis/les=0/0 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[10.5( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[9.16( empty local-lis/les=0/0 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[12.3( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[8.1f( empty local-lis/les=0/0 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[7.14( empty local-lis/les=0/0 n=0 ec=53/20 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[10.19( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[7.11( empty local-lis/les=0/0 n=0 ec=53/20 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[12.1a( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[12.18( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[10.1d( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[7.1f( empty local-lis/les=0/0 n=0 ec=53/20 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[10.1f( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[7.1d( empty local-lis/les=0/0 n=0 ec=53/20 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[10.9( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[7.a( empty local-lis/les=0/0 n=0 ec=53/20 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[12.7( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[10.1( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[10.7( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[7.16( empty local-lis/les=0/0 n=0 ec=53/20 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[12.1e( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[12.17( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[12.1d( empty local-lis/les=0/0 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[10.1b( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 60 pg[10.11( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:27 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd10003430 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:28 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 23 15:43:28 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 23 15:43:28 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 23 15:43:28 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]': finished
Nov 23 15:43:28 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 23 15:43:28 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Nov 23 15:43:28 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 23 15:43:28 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e61 e61: 3 total, 3 up, 3 in
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.1f( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.19( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.1f( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.1d( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.1d( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.13( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.13( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.19( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.1( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.1( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.17( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.17( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.15( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.15( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.f( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.f( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.7( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.7( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.1b( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.1b( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.3( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.3( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.b( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.b( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.d( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.d( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.5( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.5( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.11( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.9( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.9( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[10.11( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=61) [2]/[0] r=-1 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[7.14( empty local-lis/les=60/61 n=0 ec=53/20 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[12.18( empty local-lis/les=60/61 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[8.1c( v 50'45 (0'0,50'45] local-lis/les=60/61 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=50'45 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[9.7( v 43'12 (0'0,43'12] local-lis/les=60/61 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=43'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[8.6( v 50'45 (0'0,50'45] local-lis/les=60/61 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=50'45 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[8.c( v 50'45 (0'0,50'45] local-lis/les=60/61 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=50'45 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[8.1f( v 50'45 (0'0,50'45] local-lis/les=60/61 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=50'45 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[12.2( empty local-lis/les=60/61 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[9.1d( v 43'12 (0'0,43'12] local-lis/les=60/61 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=43'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[8.11( v 50'45 (0'0,50'45] local-lis/les=60/61 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=50'45 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[12.7( empty local-lis/les=60/61 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[8.3( v 50'45 (0'0,50'45] local-lis/les=60/61 n=1 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=50'45 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[12.11( empty local-lis/les=60/61 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[6.d( v 49'39 lc 48'13 (0'0,49'39] local-lis/les=60/61 n=1 ec=53/18 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=49'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[12.13( empty local-lis/les=60/61 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[9.16( v 43'12 (0'0,43'12] local-lis/les=60/61 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=43'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[12.9( v 59'1 lc 0'0 (0'0,59'1] local-lis/les=60/61 n=1 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=59'1 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[8.d( v 50'45 lc 50'18 (0'0,50'45] local-lis/les=60/61 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=50'45 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[6.3( v 49'39 lc 0'0 (0'0,49'39] local-lis/les=60/61 n=2 ec=53/18 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=49'39 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[11.e( v 59'57 lc 59'56 (0'0,59'57] local-lis/les=60/61 n=0 ec=57/46 lis/c=57/57 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=59'57 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[9.3( v 43'12 lc 43'2 (0'0,43'12] local-lis/les=60/61 n=1 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=43'12 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[8.2( v 50'45 (0'0,50'45] local-lis/les=60/61 n=1 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=50'45 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[7.1d( empty local-lis/les=60/61 n=0 ec=53/20 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[8.5( v 50'45 (0'0,50'45] local-lis/les=60/61 n=1 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=50'45 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[9.13( v 43'12 (0'0,43'12] local-lis/les=60/61 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=43'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[6.b( v 49'39 lc 0'0 (0'0,49'39] local-lis/les=60/61 n=1 ec=53/18 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=49'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[12.1d( empty local-lis/les=60/61 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[12.4( empty local-lis/les=60/61 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[7.16( empty local-lis/les=60/61 n=0 ec=53/20 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[9.18( v 43'12 lc 0'0 (0'0,43'12] local-lis/les=60/61 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=43'12 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[12.1a( empty local-lis/les=60/61 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[7.a( empty local-lis/les=60/61 n=0 ec=53/20 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[7.11( empty local-lis/les=60/61 n=0 ec=53/20 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[6.f( v 49'39 lc 48'1 (0'0,49'39] local-lis/les=60/61 n=3 ec=53/18 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=49'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[11.19( v 47'48 (0'0,47'48] local-lis/les=60/61 n=0 ec=57/46 lis/c=57/57 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=47'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[9.5( v 43'12 (0'0,43'12] local-lis/les=60/61 n=1 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=43'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[12.1e( empty local-lis/les=60/61 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[8.9( v 50'45 (0'0,50'45] local-lis/les=60/61 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=50'45 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[8.16( v 50'45 lc 0'0 (0'0,50'45] local-lis/les=60/61 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=50'45 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[11.a( v 47'48 (0'0,47'48] local-lis/les=60/61 n=0 ec=57/46 lis/c=57/57 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=47'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[9.17( v 43'12 (0'0,43'12] local-lis/les=60/61 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=43'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[9.8( v 43'12 (0'0,43'12] local-lis/les=60/61 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=43'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[12.17( empty local-lis/les=60/61 n=0 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[6.7( v 49'39 lc 48'21 (0'0,49'39] local-lis/les=60/61 n=1 ec=53/18 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=49'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[8.f( v 50'45 (0'0,50'45] local-lis/les=60/61 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=50'45 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[6.1( v 49'39 (0'0,49'39] local-lis/les=60/61 n=2 ec=53/18 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=49'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[6.9( v 49'39 (0'0,49'39] local-lis/les=60/61 n=0 ec=53/18 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=49'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[12.3( v 59'1 lc 0'0 (0'0,59'1] local-lis/les=60/61 n=1 ec=58/48 lis/c=58/58 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[58,60)/1 crt=59'1 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[11.3( v 59'57 lc 59'56 (0'0,59'57] local-lis/les=60/61 n=1 ec=57/46 lis/c=57/57 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=59'57 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[11.8( v 47'48 (0'0,47'48] local-lis/les=60/61 n=1 ec=57/46 lis/c=57/57 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=47'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[6.5( v 49'39 lc 48'11 (0'0,49'39] local-lis/les=60/61 n=2 ec=53/18 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=49'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[8.b( v 50'45 (0'0,50'45] local-lis/les=60/61 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=50'45 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[7.5( empty local-lis/les=60/61 n=0 ec=53/20 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[7.1f( empty local-lis/les=60/61 n=0 ec=53/20 lis/c=53/53 les/c/f=54/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[9.b( v 43'12 (0'0,43'12] local-lis/les=60/61 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=43'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[11.13( v 47'48 (0'0,47'48] local-lis/les=60/61 n=0 ec=57/46 lis/c=57/57 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=47'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[8.15( v 50'45 (0'0,50'45] local-lis/les=60/61 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=50'45 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[9.9( v 43'12 (0'0,43'12] local-lis/les=60/61 n=0 ec=55/42 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=43'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[11.16( v 47'48 (0'0,47'48] local-lis/les=60/61 n=0 ec=57/46 lis/c=57/57 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=47'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[11.17( v 47'48 (0'0,47'48] local-lis/les=60/61 n=0 ec=57/46 lis/c=57/57 les/c/f=59/59/0 sis=60) [2] r=0 lpr=60 pi=[57,60)/1 crt=47'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 61 pg[8.a( v 50'45 (0'0,50'45] local-lis/les=60/61 n=0 ec=55/39 lis/c=55/55 les/c/f=56/56/0 sis=60) [2] r=0 lpr=60 pi=[55,60)/1 crt=50'45 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:28 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd1c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:29 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:29 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]: dispatch
Nov 23 15:43:29 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Nov 23 15:43:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:29 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd14002cb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e61 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:43:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e62 e62: 3 total, 3 up, 3 in
Nov 23 15:43:29 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 62 pg[6.b( v 49'39 lc 0'0 (0'0,49'39] local-lis/les=60/61 n=1 ec=53/18 lis/c=60/53 les/c/f=61/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=49'39 mlcod 0'0 active+recovering rops=1 m=1 mbc={255={(0+1)=1}}] scrubber<NotActive>: update_scrub_job !!! primary but not scheduled! 
Nov 23 15:43:29 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 62 pg[6.f( v 49'39 lc 48'1 (0'0,49'39] local-lis/les=60/61 n=3 ec=53/18 lis/c=60/53 les/c/f=61/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=49'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=3 mbc={255={(0+1)=3}}] scrubber<NotActive>: update_scrub_job !!! primary but not scheduled! 
Nov 23 15:43:29 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 62 pg[6.7( v 49'39 lc 48'21 (0'0,49'39] local-lis/les=60/61 n=1 ec=53/18 lis/c=60/53 les/c/f=61/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=49'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=1 mbc={255={(0+1)=1}}] scrubber<NotActive>: update_scrub_job !!! primary but not scheduled! 
Nov 23 15:43:29 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 62 pg[6.5( v 49'39 lc 48'11 (0'0,49'39] local-lis/les=60/61 n=2 ec=53/18 lis/c=60/53 les/c/f=61/54/0 sis=60) [2] r=0 lpr=60 pi=[53,60)/1 crt=49'39 lcod 0'0 mlcod 0'0 active+recovery_wait+degraded m=2 mbc={255={(0+1)=2}}] scrubber<NotActive>: update_scrub_job !!! primary but not scheduled! 
Nov 23 15:43:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:29 2025: (VI_0) Entering MASTER STATE
Nov 23 15:43:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:29 2025: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Nov 23 15:43:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:29 2025: (VI_0) Entering BACKUP STATE
Nov 23 15:43:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:29 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd2c002860 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:30 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]': finished
Nov 23 15:43:30 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Nov 23 15:43:30 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:30 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e63 e63: 3 total, 3 up, 3 in
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.19( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.19( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.1d( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.1d( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.13( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.13( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.1( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.1( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.f( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.f( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.15( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.15( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.b( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.b( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.d( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.d( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.5( v 62'998 (0'0,62'998] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 luod=0'0 crt=59'994 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.5( v 62'998 (0'0,62'998] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=59'994 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.17( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 63 pg[10.17( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:30 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd10003430 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 6.7 scrub starts
Nov 23 15:43:30 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 6.7 scrub ok
Nov 23 15:43:31 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:31 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:31 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:31 np0005532763 ceph-mon[75752]: Regenerating cephadm self-signed grafana TLS certificates
Nov 23 15:43:31 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:31 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:31 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Nov 23 15:43:31 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:31 np0005532763 ceph-mon[75752]: Deploying daemon grafana.compute-0 on compute-0
Nov 23 15:43:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:31 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd1c004050 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:31 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e64 e64: 3 total, 3 up, 3 in
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.7( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.7( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.1b( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.1b( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.3( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.11( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.3( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.11( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.9( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.9( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.19( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.1d( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.1( v 50'991 (0'0,50'991] local-lis/les=63/64 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.13( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.15( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.17( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.f( v 50'991 (0'0,50'991] local-lis/les=63/64 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.b( v 50'991 (0'0,50'991] local-lis/les=63/64 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.d( v 50'991 (0'0,50'991] local-lis/les=63/64 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 64 pg[10.5( v 62'998 (0'0,62'998] local-lis/les=63/64 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=63) [2] r=0 lpr=63 pi=[57,63)/1 crt=62'998 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.9 scrub starts
Nov 23 15:43:31 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.9 scrub ok
Nov 23 15:43:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:31 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd14003db0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:32 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd2c002860 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:32 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 11.e scrub starts
Nov 23 15:43:32 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e65 e65: 3 total, 3 up, 3 in
Nov 23 15:43:32 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 11.e scrub ok
Nov 23 15:43:32 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 65 pg[10.9( v 50'991 (0'0,50'991] local-lis/les=64/65 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:32 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 65 pg[10.11( v 50'991 (0'0,50'991] local-lis/les=64/65 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:32 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 65 pg[10.1b( v 50'991 (0'0,50'991] local-lis/les=64/65 n=5 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:32 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 65 pg[10.7( v 50'991 (0'0,50'991] local-lis/les=64/65 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:32 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 65 pg[10.3( v 50'991 (0'0,50'991] local-lis/les=64/65 n=6 ec=57/44 lis/c=61/57 les/c/f=62/58/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:33 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd10003430 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:33 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.d deep-scrub starts
Nov 23 15:43:33 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.d deep-scrub ok
Nov 23 15:43:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:33 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd1c004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:34 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e65 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:43:34 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Nov 23 15:43:34 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Nov 23 15:43:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:34 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd14003db0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:43:35 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e66 e66: 3 total, 3 up, 3 in
Nov 23 15:43:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[85297]: 23/11/2025 20:43:35 : epoch 692371cb : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efd34000df0 fd 47 proxy ignored for local
Nov 23 15:43:35 np0005532763 kernel: ganesha.nfsd[86099]: segfault at 50 ip 00007efdebeae32e sp 00007efda8ff8210 error 4 in libntirpc.so.5.8[7efdebe93000+2c000] likely on CPU 5 (core 0, socket 5)
Nov 23 15:43:35 np0005532763 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Nov 23 15:43:35 np0005532763 systemd[1]: Created slice Slice /system/systemd-coredump.
Nov 23 15:43:35 np0005532763 systemd[1]: Started Process Core Dump (PID 86101/UID 0).
Nov 23 15:43:35 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.18 scrub starts
Nov 23 15:43:35 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.18 scrub ok
Nov 23 15:43:36 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.c scrub starts
Nov 23 15:43:36 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.c scrub ok
Nov 23 15:43:37 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.7 scrub starts
Nov 23 15:43:37 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.7 scrub ok
Nov 23 15:43:37 np0005532763 systemd-coredump[86102]: Process 85301 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 56:#012#0  0x00007efdebeae32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Nov 23 15:43:38 np0005532763 systemd[1]: systemd-coredump@0-86101-0.service: Deactivated successfully.
Nov 23 15:43:38 np0005532763 systemd[1]: systemd-coredump@0-86101-0.service: Consumed 1.266s CPU time.
Nov 23 15:43:38 np0005532763 podman[86108]: 2025-11-23 20:43:38.141745288 +0000 UTC m=+0.040511862 container died bed36d32aed75483423ddad867bd3597890ddea7438fb907c2da911b493cc00a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 15:43:38 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.1c scrub starts
Nov 23 15:43:38 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.1c scrub ok
Nov 23 15:43:38 np0005532763 systemd[1]: var-lib-containers-storage-overlay-50f777721e929df053d35c3f357207cad5e44090b2a32521c900a1ca0dec6f96-merged.mount: Deactivated successfully.
Nov 23 15:43:38 np0005532763 podman[86108]: 2025-11-23 20:43:38.507972897 +0000 UTC m=+0.406739421 container remove bed36d32aed75483423ddad867bd3597890ddea7438fb907c2da911b493cc00a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 23 15:43:38 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Main process exited, code=exited, status=139/n/a
Nov 23 15:43:38 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Failed with result 'exit-code'.
Nov 23 15:43:38 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.882s CPU time.
Nov 23 15:43:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e67 e67: 3 total, 3 up, 3 in
Nov 23 15:43:39 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 67 pg[6.3( v 49'39 (0'0,49'39] local-lis/les=60/61 n=2 ec=53/18 lis/c=60/60 les/c/f=61/61/0 sis=67 pruub=13.226939201s) [0] r=-1 lpr=67 pi=[60,67)/1 crt=49'39 mlcod 49'39 active pruub 132.888336182s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:39 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 67 pg[6.3( v 49'39 (0'0,49'39] local-lis/les=60/61 n=2 ec=53/18 lis/c=60/60 les/c/f=61/61/0 sis=67 pruub=13.226812363s) [0] r=-1 lpr=67 pi=[60,67)/1 crt=49'39 mlcod 0'0 unknown NOTIFY pruub 132.888336182s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:39 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 67 pg[6.b( v 49'39 (0'0,49'39] local-lis/les=60/61 n=1 ec=53/18 lis/c=60/60 les/c/f=61/62/0 sis=67 pruub=13.226820946s) [0] r=-1 lpr=67 pi=[60,67)/1 crt=49'39 mlcod 49'39 active pruub 132.888504028s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:39 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 67 pg[6.b( v 49'39 (0'0,49'39] local-lis/les=60/61 n=1 ec=53/18 lis/c=60/60 les/c/f=61/62/0 sis=67 pruub=13.226753235s) [0] r=-1 lpr=67 pi=[60,67)/1 crt=49'39 mlcod 0'0 unknown NOTIFY pruub 132.888504028s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:39 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 67 pg[6.f( v 49'39 (0'0,49'39] local-lis/les=60/61 n=3 ec=53/18 lis/c=60/60 les/c/f=61/62/0 sis=67 pruub=13.226746559s) [0] r=-1 lpr=67 pi=[60,67)/1 crt=49'39 mlcod 49'39 active pruub 132.888671875s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:39 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 67 pg[6.f( v 49'39 (0'0,49'39] local-lis/les=60/61 n=3 ec=53/18 lis/c=60/60 les/c/f=61/62/0 sis=67 pruub=13.226655960s) [0] r=-1 lpr=67 pi=[60,67)/1 crt=49'39 mlcod 0'0 unknown NOTIFY pruub 132.888671875s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:39 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 67 pg[6.7( v 49'39 (0'0,49'39] local-lis/les=60/61 n=1 ec=53/18 lis/c=60/60 les/c/f=61/62/0 sis=67 pruub=13.226519585s) [0] r=-1 lpr=67 pi=[60,67)/1 crt=49'39 mlcod 49'39 active pruub 132.888793945s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:39 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 67 pg[6.7( v 49'39 (0'0,49'39] local-lis/les=60/61 n=1 ec=53/18 lis/c=60/60 les/c/f=61/62/0 sis=67 pruub=13.226462364s) [0] r=-1 lpr=67 pi=[60,67)/1 crt=49'39 mlcod 0'0 unknown NOTIFY pruub 132.888793945s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:43:39 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]: dispatch
Nov 23 15:43:39 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Nov 23 15:43:39 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.2 scrub starts
Nov 23 15:43:39 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.2 scrub ok
Nov 23 15:43:40 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e68 e68: 3 total, 3 up, 3 in
Nov 23 15:43:40 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]: dispatch
Nov 23 15:43:40 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Nov 23 15:43:40 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]': finished
Nov 23 15:43:40 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Nov 23 15:43:40 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]': finished
Nov 23 15:43:40 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Nov 23 15:43:40 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.1f scrub starts
Nov 23 15:43:40 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.1f scrub ok
Nov 23 15:43:41 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.11 scrub starts
Nov 23 15:43:41 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.11 scrub ok
Nov 23 15:43:41 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:41 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:41 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:41 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:41 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:42 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.7 scrub starts
Nov 23 15:43:42 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.7 scrub ok
Nov 23 15:43:42 np0005532763 ceph-mon[75752]: Deploying daemon haproxy.rgw.default.compute-0.pteysg on compute-0
Nov 23 15:43:42 np0005532763 ceph-mon[75752]: Health check failed: Degraded data redundancy: 3/226 objects degraded (1.327%), 2 pgs degraded (PG_DEGRADED)
Nov 23 15:43:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/204343 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:43:43 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.11 scrub starts
Nov 23 15:43:43 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.11 scrub ok
Nov 23 15:43:43 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:43 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:43 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:43 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:43 np0005532763 podman[86248]: 2025-11-23 20:43:43.48353898 +0000 UTC m=+0.062582214 container create 95a8b81ecfef290013c31349c4870133b9ac9498e33f5651685896eea11ebe02 (image=quay.io/ceph/haproxy:2.3, name=thirsty_lehmann)
Nov 23 15:43:43 np0005532763 systemd[1]: Started libpod-conmon-95a8b81ecfef290013c31349c4870133b9ac9498e33f5651685896eea11ebe02.scope.
Nov 23 15:43:43 np0005532763 podman[86248]: 2025-11-23 20:43:43.456671453 +0000 UTC m=+0.035714757 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Nov 23 15:43:43 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:43:43 np0005532763 podman[86248]: 2025-11-23 20:43:43.604078387 +0000 UTC m=+0.183121661 container init 95a8b81ecfef290013c31349c4870133b9ac9498e33f5651685896eea11ebe02 (image=quay.io/ceph/haproxy:2.3, name=thirsty_lehmann)
Nov 23 15:43:43 np0005532763 podman[86248]: 2025-11-23 20:43:43.615511429 +0000 UTC m=+0.194554663 container start 95a8b81ecfef290013c31349c4870133b9ac9498e33f5651685896eea11ebe02 (image=quay.io/ceph/haproxy:2.3, name=thirsty_lehmann)
Nov 23 15:43:43 np0005532763 podman[86248]: 2025-11-23 20:43:43.619545762 +0000 UTC m=+0.198589016 container attach 95a8b81ecfef290013c31349c4870133b9ac9498e33f5651685896eea11ebe02 (image=quay.io/ceph/haproxy:2.3, name=thirsty_lehmann)
Nov 23 15:43:43 np0005532763 thirsty_lehmann[86265]: 0 0
Nov 23 15:43:43 np0005532763 systemd[1]: libpod-95a8b81ecfef290013c31349c4870133b9ac9498e33f5651685896eea11ebe02.scope: Deactivated successfully.
Nov 23 15:43:43 np0005532763 podman[86248]: 2025-11-23 20:43:43.624125911 +0000 UTC m=+0.203169145 container died 95a8b81ecfef290013c31349c4870133b9ac9498e33f5651685896eea11ebe02 (image=quay.io/ceph/haproxy:2.3, name=thirsty_lehmann)
Nov 23 15:43:43 np0005532763 systemd[1]: var-lib-containers-storage-overlay-cc2e07a312d3bafc968288d332e7cd8f0df3f88c8db3e7f4ef032635fb59bbfc-merged.mount: Deactivated successfully.
Nov 23 15:43:43 np0005532763 podman[86248]: 2025-11-23 20:43:43.686435617 +0000 UTC m=+0.265478851 container remove 95a8b81ecfef290013c31349c4870133b9ac9498e33f5651685896eea11ebe02 (image=quay.io/ceph/haproxy:2.3, name=thirsty_lehmann)
Nov 23 15:43:43 np0005532763 systemd[1]: libpod-conmon-95a8b81ecfef290013c31349c4870133b9ac9498e33f5651685896eea11ebe02.scope: Deactivated successfully.
Nov 23 15:43:43 np0005532763 systemd[1]: Reloading.
Nov 23 15:43:43 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:43:43 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:43:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:43:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000057s ======
Nov 23 15:43:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:43:43.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Nov 23 15:43:44 np0005532763 systemd[1]: Reloading.
Nov 23 15:43:44 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:43:44 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e68 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:43:44 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.3 scrub starts
Nov 23 15:43:44 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.3 scrub ok
Nov 23 15:43:44 np0005532763 systemd[1]: Starting Ceph haproxy.rgw.default.compute-2.tmivar for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: Deploying daemon haproxy.rgw.default.compute-2.tmivar on compute-2
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:44.750721) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930624750870, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7138, "num_deletes": 255, "total_data_size": 19776566, "memory_usage": 20664880, "flush_reason": "Manual Compaction"}
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Nov 23 15:43:44 np0005532763 podman[86414]: 2025-11-23 20:43:44.760439357 +0000 UTC m=+0.076575879 container create d33f3e9085d98a31dd83e3c506a3eb379af0b312f1320de4afbefafd99b8da13 (image=quay.io/ceph/haproxy:2.3, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-rgw-default-compute-2-tmivar)
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930624819739, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 12675126, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 7143, "table_properties": {"data_size": 12647625, "index_size": 17658, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8773, "raw_key_size": 84885, "raw_average_key_size": 24, "raw_value_size": 12579835, "raw_average_value_size": 3600, "num_data_blocks": 779, "num_entries": 3494, "num_filter_entries": 3494, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 1763930464, "file_creation_time": 1763930624, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 69077 microseconds, and 40668 cpu microseconds.
Nov 23 15:43:44 np0005532763 podman[86414]: 2025-11-23 20:43:44.729219987 +0000 UTC m=+0.045356589 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:44.819805) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 12675126 bytes OK
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:44.819831) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:44.822483) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:44.822514) EVENT_LOG_v1 {"time_micros": 1763930624822505, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:44.822537) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 19738717, prev total WAL file size 19738717, number of live WAL files 2.
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:44.828919) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323531' seq:0, type:0; will stop at (end)
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(12MB) 8(1648B)]
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930624829082, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 12676774, "oldest_snapshot_seqno": -1}
Nov 23 15:43:44 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e5488d7ced6235a71d44a2e83113ae4f08376781b46c4cde3b1081d16ca0aa5/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Nov 23 15:43:44 np0005532763 podman[86414]: 2025-11-23 20:43:44.851832402 +0000 UTC m=+0.167968974 container init d33f3e9085d98a31dd83e3c506a3eb379af0b312f1320de4afbefafd99b8da13 (image=quay.io/ceph/haproxy:2.3, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-rgw-default-compute-2-tmivar)
Nov 23 15:43:44 np0005532763 podman[86414]: 2025-11-23 20:43:44.860839515 +0000 UTC m=+0.176976047 container start d33f3e9085d98a31dd83e3c506a3eb379af0b312f1320de4afbefafd99b8da13 (image=quay.io/ceph/haproxy:2.3, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-rgw-default-compute-2-tmivar)
Nov 23 15:43:44 np0005532763 bash[86414]: d33f3e9085d98a31dd83e3c506a3eb379af0b312f1320de4afbefafd99b8da13
Nov 23 15:43:44 np0005532763 systemd[1]: Started Ceph haproxy.rgw.default.compute-2.tmivar for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:43:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-rgw-default-compute-2-tmivar[86431]: [NOTICE] 326/204344 (2) : New worker #1 (4) forked
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3242 keys, 12671313 bytes, temperature: kUnknown
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930624902170, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 12671313, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12644480, "index_size": 17635, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8133, "raw_key_size": 81436, "raw_average_key_size": 25, "raw_value_size": 12579893, "raw_average_value_size": 3880, "num_data_blocks": 778, "num_entries": 3242, "num_filter_entries": 3242, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763930624, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:44.902491) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 12671313 bytes
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:44.903871) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 173.2 rd, 173.1 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(12.1, 0.0 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3499, records dropped: 257 output_compression: NoCompression
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:44.903903) EVENT_LOG_v1 {"time_micros": 1763930624903888, "job": 4, "event": "compaction_finished", "compaction_time_micros": 73188, "compaction_time_cpu_micros": 45626, "output_level": 6, "num_output_files": 1, "total_output_size": 12671313, "num_input_records": 3499, "num_output_records": 3242, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930624908077, "job": 4, "event": "table_file_deletion", "file_number": 14}
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930624908145, "job": 4, "event": "table_file_deletion", "file_number": 8}
Nov 23 15:43:44 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:44.828794) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:43:45 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.6 deep-scrub starts
Nov 23 15:43:45 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.6 deep-scrub ok
Nov 23 15:43:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:45 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:43:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:43:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:43:46.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:43:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:43:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:43:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:43:46.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:43:46 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.1d deep-scrub starts
Nov 23 15:43:46 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.1d deep-scrub ok
Nov 23 15:43:46 np0005532763 ceph-mon[75752]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Nov 23 15:43:46 np0005532763 ceph-mon[75752]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Nov 23 15:43:46 np0005532763 ceph-mon[75752]: Deploying daemon keepalived.rgw.default.compute-0.xymmfk on compute-0
Nov 23 15:43:46 np0005532763 systemd-logind[830]: New session 36 of user zuul.
Nov 23 15:43:46 np0005532763 systemd[1]: Started Session 36 of User zuul.
Nov 23 15:43:47 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.13 scrub starts
Nov 23 15:43:47 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.13 scrub ok
Nov 23 15:43:47 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:47 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:47 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:47 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]: dispatch
Nov 23 15:43:47 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Nov 23 15:43:47 np0005532763 podman[86692]: 2025-11-23 20:43:47.695061378 +0000 UTC m=+0.067150383 container create a26ae8747820981968333b66179c86853c1cd6ba21b0db67cef90646d9172072 (image=quay.io/ceph/keepalived:2.2.4, name=pedantic_northcutt, description=keepalived for Ceph, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, version=2.2.4, vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, build-date=2023-02-22T09:23:20)
Nov 23 15:43:47 np0005532763 systemd[1]: Started libpod-conmon-a26ae8747820981968333b66179c86853c1cd6ba21b0db67cef90646d9172072.scope.
Nov 23 15:43:47 np0005532763 podman[86692]: 2025-11-23 20:43:47.664016383 +0000 UTC m=+0.036105458 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Nov 23 15:43:47 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:43:47 np0005532763 podman[86692]: 2025-11-23 20:43:47.79594031 +0000 UTC m=+0.168029395 container init a26ae8747820981968333b66179c86853c1cd6ba21b0db67cef90646d9172072 (image=quay.io/ceph/keepalived:2.2.4, name=pedantic_northcutt, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9, release=1793, version=2.2.4, architecture=x86_64, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph.)
Nov 23 15:43:47 np0005532763 podman[86692]: 2025-11-23 20:43:47.81404978 +0000 UTC m=+0.186138775 container start a26ae8747820981968333b66179c86853c1cd6ba21b0db67cef90646d9172072 (image=quay.io/ceph/keepalived:2.2.4, name=pedantic_northcutt, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, name=keepalived, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph)
Nov 23 15:43:47 np0005532763 podman[86692]: 2025-11-23 20:43:47.817995051 +0000 UTC m=+0.190084126 container attach a26ae8747820981968333b66179c86853c1cd6ba21b0db67cef90646d9172072 (image=quay.io/ceph/keepalived:2.2.4, name=pedantic_northcutt, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, name=keepalived, distribution-scope=public, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc.)
Nov 23 15:43:47 np0005532763 pedantic_northcutt[86708]: 0 0
Nov 23 15:43:47 np0005532763 systemd[1]: libpod-a26ae8747820981968333b66179c86853c1cd6ba21b0db67cef90646d9172072.scope: Deactivated successfully.
Nov 23 15:43:47 np0005532763 conmon[86708]: conmon a26ae874782098196833 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a26ae8747820981968333b66179c86853c1cd6ba21b0db67cef90646d9172072.scope/container/memory.events
Nov 23 15:43:47 np0005532763 podman[86692]: 2025-11-23 20:43:47.825309787 +0000 UTC m=+0.197398822 container died a26ae8747820981968333b66179c86853c1cd6ba21b0db67cef90646d9172072 (image=quay.io/ceph/keepalived:2.2.4, name=pedantic_northcutt, io.openshift.expose-services=, io.buildah.version=1.28.2, release=1793, description=keepalived for Ceph, architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, version=2.2.4, name=keepalived, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20)
Nov 23 15:43:47 np0005532763 systemd[1]: var-lib-containers-storage-overlay-ecac590e6ad46e94aa856f8ac024d4a7390cb76c954bc945c7f25cd72a8a937a-merged.mount: Deactivated successfully.
Nov 23 15:43:47 np0005532763 podman[86692]: 2025-11-23 20:43:47.875617035 +0000 UTC m=+0.247706060 container remove a26ae8747820981968333b66179c86853c1cd6ba21b0db67cef90646d9172072 (image=quay.io/ceph/keepalived:2.2.4, name=pedantic_northcutt, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, release=1793, vcs-type=git, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, version=2.2.4, architecture=x86_64, distribution-scope=public, io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=)
Nov 23 15:43:47 np0005532763 systemd[1]: libpod-conmon-a26ae8747820981968333b66179c86853c1cd6ba21b0db67cef90646d9172072.scope: Deactivated successfully.
Nov 23 15:43:47 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e69 e69: 3 total, 3 up, 3 in
Nov 23 15:43:47 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 69 pg[10.14( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=69) [2] r=0 lpr=69 pi=[57,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:47 np0005532763 systemd[1]: Reloading.
Nov 23 15:43:47 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 69 pg[10.c( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=69) [2] r=0 lpr=69 pi=[57,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:47 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 69 pg[10.4( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=69) [2] r=0 lpr=69 pi=[57,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:47 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 69 pg[10.1c( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=69) [2] r=0 lpr=69 pi=[57,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:47 np0005532763 python3.9[86690]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:43:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:43:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:43:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:43:48.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:43:48 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:43:48 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:43:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:43:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:43:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:43:48.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.16 scrub starts
Nov 23 15:43:48 np0005532763 systemd[1]: Reloading.
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.16 scrub ok
Nov 23 15:43:48 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:43:48 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:43:48 np0005532763 ceph-mon[75752]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Nov 23 15:43:48 np0005532763 ceph-mon[75752]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Nov 23 15:43:48 np0005532763 ceph-mon[75752]: Deploying daemon keepalived.rgw.default.compute-2.zjypck on compute-2
Nov 23 15:43:48 np0005532763 ceph-mon[75752]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 3/226 objects degraded (1.327%), 2 pgs degraded)
Nov 23 15:43:48 np0005532763 ceph-mon[75752]: Cluster is now healthy
Nov 23 15:43:48 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]': finished
Nov 23 15:43:48 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Nov 23 15:43:48 np0005532763 systemd[1]: Starting Ceph keepalived.rgw.default.compute-2.zjypck for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:43:48 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Scheduled restart job, restart counter is at 1.
Nov 23 15:43:48 np0005532763 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:43:48 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.882s CPU time.
Nov 23 15:43:48 np0005532763 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:43:48 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e70 e70: 3 total, 3 up, 3 in
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[10.14( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[57,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[10.14( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[57,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[10.4( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[57,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[10.4( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[57,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[10.1d( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=70 pruub=14.572733879s) [1] r=-1 lpr=70 pi=[63,70)/1 crt=50'991 mlcod 0'0 active pruub 143.935363770s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[10.1d( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=70 pruub=14.572706223s) [1] r=-1 lpr=70 pi=[63,70)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 143.935363770s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[6.d( v 49'39 (0'0,49'39] local-lis/les=60/61 n=1 ec=53/18 lis/c=60/60 les/c/f=61/61/0 sis=70 pruub=11.525423050s) [0] r=-1 lpr=70 pi=[60,70)/1 crt=49'39 mlcod 49'39 active pruub 140.888381958s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[10.15( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=70 pruub=14.572417259s) [1] r=-1 lpr=70 pi=[63,70)/1 crt=50'991 mlcod 0'0 active pruub 143.935394287s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[6.d( v 49'39 (0'0,49'39] local-lis/les=60/61 n=1 ec=53/18 lis/c=60/60 les/c/f=61/61/0 sis=70 pruub=11.525390625s) [0] r=-1 lpr=70 pi=[60,70)/1 crt=49'39 mlcod 0'0 unknown NOTIFY pruub 140.888381958s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[10.15( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=70 pruub=14.572388649s) [1] r=-1 lpr=70 pi=[63,70)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 143.935394287s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[10.1c( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[57,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[10.1c( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[57,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[10.d( v 50'991 (0'0,50'991] local-lis/les=63/64 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=70 pruub=14.571907043s) [1] r=-1 lpr=70 pi=[63,70)/1 crt=50'991 mlcod 0'0 active pruub 143.935668945s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[10.d( v 50'991 (0'0,50'991] local-lis/les=63/64 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=70 pruub=14.571888924s) [1] r=-1 lpr=70 pi=[63,70)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 143.935668945s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[10.5( v 65'1001 (0'0,65'1001] local-lis/les=63/64 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=70 pruub=14.571832657s) [1] r=-1 lpr=70 pi=[63,70)/1 crt=64'999 lcod 64'1000 mlcod 64'1000 active pruub 143.935729980s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[10.5( v 65'1001 (0'0,65'1001] local-lis/les=63/64 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=70 pruub=14.571795464s) [1] r=-1 lpr=70 pi=[63,70)/1 crt=64'999 lcod 64'1000 mlcod 0'0 unknown NOTIFY pruub 143.935729980s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[10.c( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[57,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[10.c( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[57,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[6.5( v 49'39 (0'0,49'39] local-lis/les=60/61 n=2 ec=53/18 lis/c=60/60 les/c/f=61/62/0 sis=70 pruub=11.527276993s) [0] r=-1 lpr=70 pi=[60,70)/1 crt=49'39 mlcod 49'39 active pruub 140.891342163s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:48 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 70 pg[6.5( v 49'39 (0'0,49'39] local-lis/les=60/61 n=2 ec=53/18 lis/c=60/60 les/c/f=61/62/0 sis=70 pruub=11.527243614s) [0] r=-1 lpr=70 pi=[60,70)/1 crt=49'39 mlcod 0'0 unknown NOTIFY pruub 140.891342163s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:48 np0005532763 podman[86948]: 2025-11-23 20:43:48.979060834 +0000 UTC m=+0.044701041 container create 59ef4751f84adab9c22f972d7c31809a950126e4d79cea27771ad9f35edd3908 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck, vcs-type=git, io.buildah.version=1.28.2, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, release=1793, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Nov 23 15:43:49 np0005532763 podman[86954]: 2025-11-23 20:43:49.007386522 +0000 UTC m=+0.048333433 container create 3c7188ef2c85599d3ea737b72a10828fd09dcb876706e18dbc8370c9d8ae1457 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Nov 23 15:43:49 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f7bb86a72bac5180183457a94dcb776fae60eafab09c9aaf29072c589466e84/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:43:49 np0005532763 podman[86948]: 2025-11-23 20:43:49.039068065 +0000 UTC m=+0.104708292 container init 59ef4751f84adab9c22f972d7c31809a950126e4d79cea27771ad9f35edd3908 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, architecture=x86_64, com.redhat.component=keepalived-container, distribution-scope=public, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph, version=2.2.4, release=1793, vendor=Red Hat, Inc.)
Nov 23 15:43:49 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4cd9000995bf526ed7de286ae00c465705c8beee74d4048adce569b9dba300e6/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Nov 23 15:43:49 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4cd9000995bf526ed7de286ae00c465705c8beee74d4048adce569b9dba300e6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:43:49 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4cd9000995bf526ed7de286ae00c465705c8beee74d4048adce569b9dba300e6/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:43:49 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4cd9000995bf526ed7de286ae00c465705c8beee74d4048adce569b9dba300e6/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.dqbktw-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:43:49 np0005532763 podman[86948]: 2025-11-23 20:43:49.046697839 +0000 UTC m=+0.112338046 container start 59ef4751f84adab9c22f972d7c31809a950126e4d79cea27771ad9f35edd3908 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck, distribution-scope=public, build-date=2023-02-22T09:23:20, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, description=keepalived for Ceph, release=1793, vendor=Red Hat, Inc., com.redhat.component=keepalived-container)
Nov 23 15:43:49 np0005532763 bash[86948]: 59ef4751f84adab9c22f972d7c31809a950126e4d79cea27771ad9f35edd3908
Nov 23 15:43:49 np0005532763 podman[86948]: 2025-11-23 20:43:48.959866813 +0000 UTC m=+0.025507060 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Nov 23 15:43:49 np0005532763 podman[86954]: 2025-11-23 20:43:49.058004778 +0000 UTC m=+0.098951699 container init 3c7188ef2c85599d3ea737b72a10828fd09dcb876706e18dbc8370c9d8ae1457 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:49 2025: Starting Keepalived v2.2.4 (08/21,2021)
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:49 2025: Running on Linux 5.14.0-639.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Sat Nov 15 10:30:41 UTC 2025 (built for Linux 5.14.0)
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:49 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:49 2025: Configuration file /etc/keepalived/keepalived.conf
Nov 23 15:43:49 np0005532763 systemd[1]: Started Ceph keepalived.rgw.default.compute-2.zjypck for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:43:49 np0005532763 podman[86954]: 2025-11-23 20:43:49.066887718 +0000 UTC m=+0.107834639 container start 3c7188ef2c85599d3ea737b72a10828fd09dcb876706e18dbc8370c9d8ae1457 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:49 2025: Failed to bind to process monitoring socket - errno 98 - Address already in use
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:49 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Nov 23 15:43:49 np0005532763 bash[86954]: 3c7188ef2c85599d3ea737b72a10828fd09dcb876706e18dbc8370c9d8ae1457
Nov 23 15:43:49 np0005532763 podman[86954]: 2025-11-23 20:43:48.990582458 +0000 UTC m=+0.031529379 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:49 2025: Starting VRRP child process, pid=4
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:49 2025: Startup complete
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:49 2025: (VI_0) Entering BACKUP STATE (init)
Nov 23 15:43:49 np0005532763 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:43:49 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:43:49 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:49 2025: VRRP_Script(check_backend) succeeded
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:43:49 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:43:49 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:43:49 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:49.135015) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930629135085, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 434, "num_deletes": 251, "total_data_size": 372779, "memory_usage": 382456, "flush_reason": "Manual Compaction"}
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:43:49 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930629141237, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 244168, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7148, "largest_seqno": 7577, "table_properties": {"data_size": 241580, "index_size": 624, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6752, "raw_average_key_size": 19, "raw_value_size": 236093, "raw_average_value_size": 672, "num_data_blocks": 26, "num_entries": 351, "num_filter_entries": 351, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930625, "oldest_key_time": 1763930625, "file_creation_time": 1763930629, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 6308 microseconds, and 2237 cpu microseconds.
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:43:49 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:49.141329) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 244168 bytes OK
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:49.141350) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:49.144588) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:49.144616) EVENT_LOG_v1 {"time_micros": 1763930629144607, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:49.144641) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 369934, prev total WAL file size 388166, number of live WAL files 2.
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:49.146051) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(238KB)], [15(12MB)]
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930629146102, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 12915481, "oldest_snapshot_seqno": -1}
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3073 keys, 11707410 bytes, temperature: kUnknown
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930629204556, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 11707410, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11682533, "index_size": 16064, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7749, "raw_key_size": 79244, "raw_average_key_size": 25, "raw_value_size": 11621536, "raw_average_value_size": 3781, "num_data_blocks": 702, "num_entries": 3073, "num_filter_entries": 3073, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763930629, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:49.204846) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 11707410 bytes
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:49.206507) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 220.6 rd, 200.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 12.1 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(100.8) write-amplify(47.9) OK, records in: 3593, records dropped: 520 output_compression: NoCompression
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:49.206543) EVENT_LOG_v1 {"time_micros": 1763930629206527, "job": 6, "event": "compaction_finished", "compaction_time_micros": 58543, "compaction_time_cpu_micros": 28886, "output_level": 6, "num_output_files": 1, "total_output_size": 11707410, "num_input_records": 3593, "num_output_records": 3073, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930629206784, "job": 6, "event": "table_file_deletion", "file_number": 17}
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930629211428, "job": 6, "event": "table_file_deletion", "file_number": 15}
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:49.145999) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:49.211496) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:49.211502) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:49.211505) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:49.211507) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:43:49.211508) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:43:49 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:43:49 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.5 scrub starts
Nov 23 15:43:49 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.5 scrub ok
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e70 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:43:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]: dispatch
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]': finished
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e71 e71: 3 total, 3 up, 3 in
Nov 23 15:43:49 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 71 pg[10.5( v 65'1001 (0'0,65'1001] local-lis/les=63/64 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=71) [1]/[2] r=0 lpr=71 pi=[63,71)/1 crt=64'999 lcod 64'1000 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:49 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 71 pg[10.5( v 65'1001 (0'0,65'1001] local-lis/les=63/64 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=71) [1]/[2] r=0 lpr=71 pi=[63,71)/1 crt=64'999 lcod 64'1000 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:49 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 71 pg[10.d( v 50'991 (0'0,50'991] local-lis/les=63/64 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=71) [1]/[2] r=0 lpr=71 pi=[63,71)/1 crt=50'991 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:49 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 71 pg[10.d( v 50'991 (0'0,50'991] local-lis/les=63/64 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=71) [1]/[2] r=0 lpr=71 pi=[63,71)/1 crt=50'991 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:49 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 71 pg[10.1d( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=71) [1]/[2] r=0 lpr=71 pi=[63,71)/1 crt=50'991 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:49 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 71 pg[10.15( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=71) [1]/[2] r=0 lpr=71 pi=[63,71)/1 crt=50'991 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:49 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 71 pg[10.1d( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=71) [1]/[2] r=0 lpr=71 pi=[63,71)/1 crt=50'991 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:49 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 71 pg[10.15( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=71) [1]/[2] r=0 lpr=71 pi=[63,71)/1 crt=50'991 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:43:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:43:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:43:50.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:43:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:43:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:43:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:43:50.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:43:50 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.2 scrub starts
Nov 23 15:43:50 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.2 scrub ok
Nov 23 15:43:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:50 np0005532763 ceph-mon[75752]: Deploying daemon prometheus.compute-0 on compute-0
Nov 23 15:43:50 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e72 e72: 3 total, 3 up, 3 in
Nov 23 15:43:50 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 72 pg[10.4( v 71'1001 (0'0,71'1001] local-lis/les=0/0 n=6 ec=57/44 lis/c=70/57 les/c/f=71/58/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 luod=0'0 crt=59'998 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:50 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 72 pg[10.4( v 71'1001 (0'0,71'1001] local-lis/les=0/0 n=6 ec=57/44 lis/c=70/57 les/c/f=71/58/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 crt=59'998 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:50 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 72 pg[10.1c( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=70/57 les/c/f=71/58/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:50 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 72 pg[10.1c( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=70/57 les/c/f=71/58/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:50 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 72 pg[10.14( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=70/57 les/c/f=71/58/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:50 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 72 pg[10.14( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=70/57 les/c/f=71/58/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:50 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 72 pg[10.c( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=70/57 les/c/f=71/58/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:50 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 72 pg[10.c( v 50'991 (0'0,50'991] local-lis/les=0/0 n=6 ec=57/44 lis/c=70/57 les/c/f=71/58/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:43:50 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 72 pg[10.15( v 50'991 (0'0,50'991] local-lis/les=71/72 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=71) [1]/[2] async=[1] r=0 lpr=71 pi=[63,71)/1 crt=50'991 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:50 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 72 pg[10.1d( v 50'991 (0'0,50'991] local-lis/les=71/72 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=71) [1]/[2] async=[1] r=0 lpr=71 pi=[63,71)/1 crt=50'991 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:50 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 72 pg[10.5( v 65'1001 (0'0,65'1001] local-lis/les=71/72 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=71) [1]/[2] async=[1] r=0 lpr=71 pi=[63,71)/1 crt=65'1001 lcod 64'1000 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:50 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 72 pg[10.d( v 50'991 (0'0,50'991] local-lis/les=71/72 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=71) [1]/[2] async=[1] r=0 lpr=71 pi=[63,71)/1 crt=50'991 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:51 np0005532763 python3.9[87178]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:43:51 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.1d scrub starts
Nov 23 15:43:51 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.1d scrub ok
Nov 23 15:43:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:51 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e73 e73: 3 total, 3 up, 3 in
Nov 23 15:43:51 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 73 pg[10.1d( v 50'991 (0'0,50'991] local-lis/les=71/72 n=5 ec=57/44 lis/c=71/63 les/c/f=72/64/0 sis=73 pruub=15.118959427s) [1] async=[1] r=-1 lpr=73 pi=[63,73)/1 crt=50'991 mlcod 50'991 active pruub 147.353530884s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:51 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 73 pg[10.1d( v 50'991 (0'0,50'991] local-lis/les=71/72 n=5 ec=57/44 lis/c=71/63 les/c/f=72/64/0 sis=73 pruub=15.118874550s) [1] r=-1 lpr=73 pi=[63,73)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 147.353530884s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:51 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 73 pg[10.15( v 50'991 (0'0,50'991] local-lis/les=71/72 n=5 ec=57/44 lis/c=71/63 les/c/f=72/64/0 sis=73 pruub=15.112724304s) [1] async=[1] r=-1 lpr=73 pi=[63,73)/1 crt=50'991 mlcod 50'991 active pruub 147.347885132s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:51 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 73 pg[10.15( v 50'991 (0'0,50'991] local-lis/les=71/72 n=5 ec=57/44 lis/c=71/63 les/c/f=72/64/0 sis=73 pruub=15.112661362s) [1] r=-1 lpr=73 pi=[63,73)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 147.347885132s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:51 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 73 pg[10.d( v 50'991 (0'0,50'991] local-lis/les=71/72 n=6 ec=57/44 lis/c=71/63 les/c/f=72/64/0 sis=73 pruub=15.117469788s) [1] async=[1] r=-1 lpr=73 pi=[63,73)/1 crt=50'991 mlcod 50'991 active pruub 147.354110718s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:51 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 73 pg[10.d( v 50'991 (0'0,50'991] local-lis/les=71/72 n=6 ec=57/44 lis/c=71/63 les/c/f=72/64/0 sis=73 pruub=15.117375374s) [1] r=-1 lpr=73 pi=[63,73)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 147.354110718s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:51 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 73 pg[10.5( v 72'1004 (0'0,72'1004] local-lis/les=71/72 n=6 ec=57/44 lis/c=71/63 les/c/f=72/64/0 sis=73 pruub=15.116751671s) [1] async=[1] r=-1 lpr=73 pi=[63,73)/1 crt=65'1001 lcod 72'1003 mlcod 72'1003 active pruub 147.353622437s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:43:51 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 73 pg[10.5( v 72'1004 (0'0,72'1004] local-lis/les=71/72 n=6 ec=57/44 lis/c=71/63 les/c/f=72/64/0 sis=73 pruub=15.116648674s) [1] r=-1 lpr=73 pi=[63,73)/1 crt=65'1001 lcod 72'1003 mlcod 0'0 unknown NOTIFY pruub 147.353622437s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:43:51 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 73 pg[10.4( v 71'1001 (0'0,71'1001] local-lis/les=72/73 n=6 ec=57/44 lis/c=70/57 les/c/f=71/58/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 crt=71'1001 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:51 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 73 pg[10.1c( v 50'991 (0'0,50'991] local-lis/les=72/73 n=5 ec=57/44 lis/c=70/57 les/c/f=71/58/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:51 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 73 pg[10.c( v 50'991 (0'0,50'991] local-lis/les=72/73 n=6 ec=57/44 lis/c=70/57 les/c/f=71/58/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:51 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 73 pg[10.14( v 50'991 (0'0,50'991] local-lis/les=72/73 n=5 ec=57/44 lis/c=70/57 les/c/f=71/58/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:43:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:43:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:43:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:43:52.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:43:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:43:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:43:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:43:52.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:43:52 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.4 scrub starts
Nov 23 15:43:52 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.4 scrub ok
Nov 23 15:43:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:52 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e74 e74: 3 total, 3 up, 3 in
Nov 23 15:43:53 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.13 scrub starts
Nov 23 15:43:53 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.13 scrub ok
Nov 23 15:43:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:53 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:43:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:43:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:43:54.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:43:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:43:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000057s ======
Nov 23 15:43:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:43:54.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Nov 23 15:43:54 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.1a deep-scrub starts
Nov 23 15:43:54 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.1a deep-scrub ok
Nov 23 15:43:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e74 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:43:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:55 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 11.19 scrub starts
Nov 23 15:43:55 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 11.19 scrub ok
Nov 23 15:43:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:43:55 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:43:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:43:55 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:43:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:43:56 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:56 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:56 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' 
Nov 23 15:43:56 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Nov 23 15:43:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:43:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:43:56.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:43:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:43:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:43:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:43:56.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:43:56 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.5 scrub starts
Nov 23 15:43:56 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.5 scrub ok
Nov 23 15:43:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:56 np0005532763 ceph-mgr[76063]: mgr handle_mgr_map respawning because set of enabled modules changed!
Nov 23 15:43:56 np0005532763 systemd[1]: session-34.scope: Deactivated successfully.
Nov 23 15:43:56 np0005532763 systemd[1]: session-34.scope: Consumed 27.549s CPU time.
Nov 23 15:43:56 np0005532763 systemd-logind[830]: Session 34 logged out. Waiting for processes to exit.
Nov 23 15:43:56 np0005532763 systemd-logind[830]: Removed session 34.
Nov 23 15:43:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: ignoring --setuser ceph since I am not root
Nov 23 15:43:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: ignoring --setgroup ceph since I am not root
Nov 23 15:43:56 np0005532763 ceph-mgr[76063]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Nov 23 15:43:56 np0005532763 ceph-mgr[76063]: pidfile_write: ignore empty --pid-file
Nov 23 15:43:56 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'alerts'
Nov 23 15:43:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:43:56.978+0000 7f848c32a140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 23 15:43:56 np0005532763 ceph-mgr[76063]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Nov 23 15:43:56 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'balancer'
Nov 23 15:43:57 np0005532763 ceph-mon[75752]: from='mgr.14424 192.168.122.100:0/3245007846' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Nov 23 15:43:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:43:57.055+0000 7f848c32a140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 23 15:43:57 np0005532763 ceph-mgr[76063]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Nov 23 15:43:57 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'cephadm'
Nov 23 15:43:57 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.1e deep-scrub starts
Nov 23 15:43:57 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.1e deep-scrub ok
Nov 23 15:43:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:57 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'crash'
Nov 23 15:43:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:43:57.828+0000 7f848c32a140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 23 15:43:57 np0005532763 ceph-mgr[76063]: mgr[py] Module crash has missing NOTIFY_TYPES member
Nov 23 15:43:57 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'dashboard'
Nov 23 15:43:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:43:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:43:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:43:58.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:43:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:43:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:43:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:43:58.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:43:58 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.9 scrub starts
Nov 23 15:43:58 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.9 scrub ok
Nov 23 15:43:58 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'devicehealth'
Nov 23 15:43:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:43:58.386+0000 7f848c32a140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 23 15:43:58 np0005532763 ceph-mgr[76063]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Nov 23 15:43:58 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'diskprediction_local'
Nov 23 15:43:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Nov 23 15:43:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Nov 23 15:43:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]:  from numpy import show_config as show_numpy_config
Nov 23 15:43:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:43:58.535+0000 7f848c32a140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 23 15:43:58 np0005532763 ceph-mgr[76063]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Nov 23 15:43:58 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'influx'
Nov 23 15:43:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:43:58.599+0000 7f848c32a140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 23 15:43:58 np0005532763 ceph-mgr[76063]: mgr[py] Module influx has missing NOTIFY_TYPES member
Nov 23 15:43:58 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'insights'
Nov 23 15:43:58 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'iostat'
Nov 23 15:43:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:43:58.723+0000 7f848c32a140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 23 15:43:58 np0005532763 ceph-mgr[76063]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Nov 23 15:43:58 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'k8sevents'
Nov 23 15:43:59 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'localpool'
Nov 23 15:43:59 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'mds_autoscaler'
Nov 23 15:43:59 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.17 scrub starts
Nov 23 15:43:59 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'mirroring'
Nov 23 15:43:59 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.17 scrub ok
Nov 23 15:43:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e74 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:43:59 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'nfs'
Nov 23 15:43:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:43:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:43:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:43:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:43:59.680+0000 7f848c32a140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 23 15:43:59 np0005532763 ceph-mgr[76063]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Nov 23 15:43:59 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'orchestrator'
Nov 23 15:43:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:43:59.899+0000 7f848c32a140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 23 15:43:59 np0005532763 ceph-mgr[76063]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Nov 23 15:43:59 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'osd_perf_query'
Nov 23 15:43:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:43:59.970+0000 7f848c32a140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 23 15:43:59 np0005532763 ceph-mgr[76063]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Nov 23 15:43:59 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'osd_support'
Nov 23 15:44:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:00.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:44:00.031+0000 7f848c32a140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 23 15:44:00 np0005532763 ceph-mgr[76063]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Nov 23 15:44:00 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'pg_autoscaler'
Nov 23 15:44:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:44:00.106+0000 7f848c32a140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 23 15:44:00 np0005532763 ceph-mgr[76063]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Nov 23 15:44:00 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'progress'
Nov 23 15:44:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:44:00.178+0000 7f848c32a140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 23 15:44:00 np0005532763 ceph-mgr[76063]: mgr[py] Module progress has missing NOTIFY_TYPES member
Nov 23 15:44:00 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'prometheus'
Nov 23 15:44:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:00.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:00 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 11.a scrub starts
Nov 23 15:44:00 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 11.a scrub ok
Nov 23 15:44:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:44:00.506+0000 7f848c32a140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 23 15:44:00 np0005532763 ceph-mgr[76063]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Nov 23 15:44:00 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'rbd_support'
Nov 23 15:44:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:44:00.600+0000 7f848c32a140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 23 15:44:00 np0005532763 ceph-mgr[76063]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Nov 23 15:44:00 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'restful'
Nov 23 15:44:00 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'rgw'
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:44:01.000+0000 7f848c32a140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 23 15:44:01 np0005532763 ceph-mgr[76063]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Nov 23 15:44:01 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'rook'
Nov 23 15:44:01 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.17 scrub starts
Nov 23 15:44:01 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.17 scrub ok
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:44:01.512+0000 7f848c32a140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 23 15:44:01 np0005532763 ceph-mgr[76063]: mgr[py] Module rook has missing NOTIFY_TYPES member
Nov 23 15:44:01 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'selftest'
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:44:01.579+0000 7f848c32a140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 23 15:44:01 np0005532763 ceph-mgr[76063]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Nov 23 15:44:01 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'snap_schedule'
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:44:01.658+0000 7f848c32a140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 23 15:44:01 np0005532763 ceph-mgr[76063]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Nov 23 15:44:01 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'stats'
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:44:01 np0005532763 systemd[1]: session-36.scope: Deactivated successfully.
Nov 23 15:44:01 np0005532763 systemd[1]: session-36.scope: Consumed 8.981s CPU time.
Nov 23 15:44:01 np0005532763 systemd-logind[830]: Session 36 logged out. Waiting for processes to exit.
Nov 23 15:44:01 np0005532763 systemd-logind[830]: Removed session 36.
Nov 23 15:44:01 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'status'
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:44:01.806+0000 7f848c32a140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Nov 23 15:44:01 np0005532763 ceph-mgr[76063]: mgr[py] Module status has missing NOTIFY_TYPES member
Nov 23 15:44:01 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'telegraf'
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:44:01.879+0000 7f848c32a140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 23 15:44:01 np0005532763 ceph-mgr[76063]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Nov 23 15:44:01 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'telemetry'
Nov 23 15:44:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:01 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8c0000df0 fd 36 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:02.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:44:02.034+0000 7f848c32a140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'test_orchestrator'
Nov 23 15:44:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:02.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:44:02.243+0000 7f848c32a140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'volumes'
Nov 23 15:44:02 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.f scrub starts
Nov 23 15:44:02 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.f scrub ok
Nov 23 15:44:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:44:02.501+0000 7f848c32a140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: mgr[py] Loading python module 'zabbix'
Nov 23 15:44:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:02 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8ac0016e0 fd 36 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 2025-11-23T20:44:02.581+0000 7f848c32a140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: mgr load Constructed class from module: dashboard
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: mgr load Constructed class from module: prometheus
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: [dashboard INFO root] Configured CherryPy, starting engine...
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: [dashboard INFO root] Starting engine...
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: [prometheus INFO root] server_addr: :: server_port: 9283
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: [prometheus INFO root] Starting engine...
Nov 23 15:44:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: [23/Nov/2025:20:44:02] ENGINE Bus STARTING
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: [prometheus INFO cherrypy.error] [23/Nov/2025:20:44:02] ENGINE Bus STARTING
Nov 23 15:44:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: CherryPy Checker:
Nov 23 15:44:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: The Application mounted at '' has an empty config.
Nov 23 15:44:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: 
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: ms_deliver_dispatch: unhandled message 0x55c7e94b9860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: [dashboard INFO root] Engine started...
Nov 23 15:44:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: [23/Nov/2025:20:44:02] ENGINE Serving on http://:::9283
Nov 23 15:44:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mgr-compute-2-jtkauz[76059]: [23/Nov/2025:20:44:02] ENGINE Bus STARTED
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: [prometheus INFO cherrypy.error] [23/Nov/2025:20:44:02] ENGINE Serving on http://:::9283
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: [prometheus INFO cherrypy.error] [23/Nov/2025:20:44:02] ENGINE Bus STARTED
Nov 23 15:44:02 np0005532763 ceph-mgr[76063]: [prometheus INFO root] Engine started.
Nov 23 15:44:03 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e75 e75: 3 total, 3 up, 3 in
Nov 23 15:44:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:03 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe89c000b60 fd 36 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:03 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.8 scrub starts
Nov 23 15:44:03 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.8 scrub ok
Nov 23 15:44:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:03 np0005532763 systemd-logind[830]: New session 37 of user ceph-admin.
Nov 23 15:44:03 np0005532763 systemd[1]: Started Session 37 of User ceph-admin.
Nov 23 15:44:03 np0005532763 ceph-mon[75752]: Active manager daemon compute-0.oyehye restarted
Nov 23 15:44:03 np0005532763 ceph-mon[75752]: Activating manager daemon compute-0.oyehye
Nov 23 15:44:03 np0005532763 ceph-mon[75752]: Manager daemon compute-0.oyehye is now available
Nov 23 15:44:03 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:03 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.oyehye/mirror_snapshot_schedule"}]: dispatch
Nov 23 15:44:03 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.oyehye/trash_purge_schedule"}]: dispatch
Nov 23 15:44:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:03 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe894000b60 fd 36 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:04.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:04.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e75 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:44:04 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 11.8 deep-scrub starts
Nov 23 15:44:04 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 11.8 deep-scrub ok
Nov 23 15:44:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:04 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8b8001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:04 np0005532763 podman[87444]: 2025-11-23 20:44:04.594467036 +0000 UTC m=+0.105689847 container exec 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 23 15:44:04 np0005532763 podman[87444]: 2025-11-23 20:44:04.715793825 +0000 UTC m=+0.227016586 container exec_died 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Nov 23 15:44:05 np0005532763 ceph-mon[75752]: [23/Nov/2025:20:44:04] ENGINE Bus STARTING
Nov 23 15:44:05 np0005532763 ceph-mon[75752]: [23/Nov/2025:20:44:04] ENGINE Serving on http://192.168.122.100:8765
Nov 23 15:44:05 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]: dispatch
Nov 23 15:44:05 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Nov 23 15:44:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/204405 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:44:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:05 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8ac002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:05 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e76 e76: 3 total, 3 up, 3 in
Nov 23 15:44:05 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.b scrub starts
Nov 23 15:44:05 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.b scrub ok
Nov 23 15:44:05 np0005532763 podman[87561]: 2025-11-23 20:44:05.459765516 +0000 UTC m=+0.090252374 container exec bfa89024a4f3a8c3745fbdf8141ab9c1af6ff603988de647c9e7f7e15dff8638 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 15:44:05 np0005532763 podman[87561]: 2025-11-23 20:44:05.472643198 +0000 UTC m=+0.103130006 container exec_died bfa89024a4f3a8c3745fbdf8141ab9c1af6ff603988de647c9e7f7e15dff8638 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 15:44:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:05 np0005532763 podman[87652]: 2025-11-23 20:44:05.905608777 +0000 UTC m=+0.073520723 container exec 3c7188ef2c85599d3ea737b72a10828fd09dcb876706e18dbc8370c9d8ae1457 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Nov 23 15:44:05 np0005532763 podman[87652]: 2025-11-23 20:44:05.923686426 +0000 UTC m=+0.091598312 container exec_died 3c7188ef2c85599d3ea737b72a10828fd09dcb876706e18dbc8370c9d8ae1457 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Nov 23 15:44:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:05 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8ac002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:06.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:06 np0005532763 ceph-mon[75752]: [23/Nov/2025:20:44:04] ENGINE Serving on https://192.168.122.100:7150
Nov 23 15:44:06 np0005532763 ceph-mon[75752]: [23/Nov/2025:20:44:04] ENGINE Bus STARTED
Nov 23 15:44:06 np0005532763 ceph-mon[75752]: [23/Nov/2025:20:44:04] ENGINE Client ('192.168.122.100', 33786) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Nov 23 15:44:06 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]': finished
Nov 23 15:44:06 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Nov 23 15:44:06 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:06 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:06 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e77 e77: 3 total, 3 up, 3 in
Nov 23 15:44:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:44:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:06.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:44:06 np0005532763 podman[87716]: 2025-11-23 20:44:06.250066642 +0000 UTC m=+0.090645995 container exec 187afc4c1e67339be091cc4caff41c0e2aaba4673fc086f757180d516596ee6c (image=quay.io/ceph/haproxy:2.3, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem)
Nov 23 15:44:06 np0005532763 podman[87716]: 2025-11-23 20:44:06.264721375 +0000 UTC m=+0.105300668 container exec_died 187afc4c1e67339be091cc4caff41c0e2aaba4673fc086f757180d516596ee6c (image=quay.io/ceph/haproxy:2.3, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem)
Nov 23 15:44:06 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Nov 23 15:44:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:06 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Nov 23 15:44:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:06 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8ac002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:06 np0005532763 podman[87785]: 2025-11-23 20:44:06.58932495 +0000 UTC m=+0.084910613 container exec f83166e24f35928d8e85c6352ec69e598c685dd22eb2d34bc93aec691f658844 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, release=1793, description=keepalived for Ceph, name=keepalived, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, vendor=Red Hat, Inc., version=2.2.4)
Nov 23 15:44:06 np0005532763 podman[87785]: 2025-11-23 20:44:06.608712857 +0000 UTC m=+0.104298500 container exec_died f83166e24f35928d8e85c6352ec69e598c685dd22eb2d34bc93aec691f658844 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, description=keepalived for Ceph, distribution-scope=public, io.openshift.tags=Ceph keepalived, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1793, architecture=x86_64)
Nov 23 15:44:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:07 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8b80025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:07 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:07 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:07 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:07 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:07 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 23 15:44:07 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Nov 23 15:44:07 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:07 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:07 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]: dispatch
Nov 23 15:44:07 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Nov 23 15:44:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e78 e78: 3 total, 3 up, 3 in
Nov 23 15:44:07 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 78 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=78 pruub=12.334017754s) [0] r=-1 lpr=78 pi=[63,78)/1 crt=50'991 mlcod 0'0 active pruub 159.933242798s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:07 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 78 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=78 pruub=12.333844185s) [0] r=-1 lpr=78 pi=[63,78)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 159.933242798s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:07 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 78 pg[10.f( v 50'991 (0'0,50'991] local-lis/les=63/64 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=78 pruub=12.336375237s) [0] r=-1 lpr=78 pi=[63,78)/1 crt=50'991 mlcod 0'0 active pruub 159.936035156s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:07 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 78 pg[10.f( v 50'991 (0'0,50'991] local-lis/les=63/64 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=78 pruub=12.336324692s) [0] r=-1 lpr=78 pi=[63,78)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 159.936035156s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:07 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 78 pg[10.7( v 50'991 (0'0,50'991] local-lis/les=64/65 n=6 ec=57/44 lis/c=64/64 les/c/f=65/65/0 sis=78 pruub=13.373891830s) [0] r=-1 lpr=78 pi=[64,78)/1 crt=50'991 mlcod 0'0 active pruub 160.974578857s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:07 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 78 pg[10.7( v 50'991 (0'0,50'991] local-lis/les=64/65 n=6 ec=57/44 lis/c=64/64 les/c/f=65/65/0 sis=78 pruub=13.373817444s) [0] r=-1 lpr=78 pi=[64,78)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 160.974578857s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:07 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 78 pg[10.17( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=78 pruub=12.334543228s) [0] r=-1 lpr=78 pi=[63,78)/1 crt=50'991 mlcod 0'0 active pruub 159.936019897s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:07 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 78 pg[10.17( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=78 pruub=12.334510803s) [0] r=-1 lpr=78 pi=[63,78)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 159.936019897s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:07 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.15 scrub starts
Nov 23 15:44:07 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.15 scrub ok
Nov 23 15:44:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:07 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8940016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:44:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:08.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:44:08 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e79 e79: 3 total, 3 up, 3 in
Nov 23 15:44:08 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 79 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=79) [0]/[2] r=0 lpr=79 pi=[63,79)/1 crt=50'991 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:08 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 79 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=79) [0]/[2] r=0 lpr=79 pi=[63,79)/1 crt=50'991 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:08 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 79 pg[10.f( v 50'991 (0'0,50'991] local-lis/les=63/64 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=79) [0]/[2] r=0 lpr=79 pi=[63,79)/1 crt=50'991 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:08 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 79 pg[10.f( v 50'991 (0'0,50'991] local-lis/les=63/64 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=79) [0]/[2] r=0 lpr=79 pi=[63,79)/1 crt=50'991 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:08 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 79 pg[10.7( v 50'991 (0'0,50'991] local-lis/les=64/65 n=6 ec=57/44 lis/c=64/64 les/c/f=65/65/0 sis=79) [0]/[2] r=0 lpr=79 pi=[64,79)/1 crt=50'991 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:08 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 79 pg[10.7( v 50'991 (0'0,50'991] local-lis/les=64/65 n=6 ec=57/44 lis/c=64/64 les/c/f=65/65/0 sis=79) [0]/[2] r=0 lpr=79 pi=[64,79)/1 crt=50'991 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:08 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 79 pg[10.17( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=79) [0]/[2] r=0 lpr=79 pi=[63,79)/1 crt=50'991 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:08 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 79 pg[10.17( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=79) [0]/[2] r=0 lpr=79 pi=[63,79)/1 crt=50'991 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]': finished
Nov 23 15:44:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Nov 23 15:44:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Nov 23 15:44:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:08.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:08 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 11.13 scrub starts
Nov 23 15:44:08 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 11.13 scrub ok
Nov 23 15:44:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:08 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe89c0016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:09 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8ac002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e80 e80: 3 total, 3 up, 3 in
Nov 23 15:44:09 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 80 pg[10.17( v 50'991 (0'0,50'991] local-lis/les=79/80 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=79) [0]/[2] async=[0] r=0 lpr=79 pi=[63,79)/1 crt=50'991 mlcod 0'0 active+remapped mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:44:09 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 80 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=79/80 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=79) [0]/[2] async=[0] r=0 lpr=79 pi=[63,79)/1 crt=50'991 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:44:09 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 80 pg[10.f( v 50'991 (0'0,50'991] local-lis/les=79/80 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=79) [0]/[2] async=[0] r=0 lpr=79 pi=[63,79)/1 crt=50'991 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:44:09 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 80 pg[10.7( v 50'991 (0'0,50'991] local-lis/les=79/80 n=6 ec=57/44 lis/c=64/64 les/c/f=65/65/0 sis=79) [0]/[2] async=[0] r=0 lpr=79 pi=[64,79)/1 crt=50'991 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:44:09 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:09 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:09 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 23 15:44:09 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:44:09 np0005532763 ceph-mon[75752]: Updating compute-0:/etc/ceph/ceph.conf
Nov 23 15:44:09 np0005532763 ceph-mon[75752]: Updating compute-1:/etc/ceph/ceph.conf
Nov 23 15:44:09 np0005532763 ceph-mon[75752]: Updating compute-2:/etc/ceph/ceph.conf
Nov 23 15:44:09 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]: dispatch
Nov 23 15:44:09 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Nov 23 15:44:09 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]': finished
Nov 23 15:44:09 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Nov 23 15:44:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e80 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:44:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e81 e81: 3 total, 3 up, 3 in
Nov 23 15:44:09 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 81 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=79/80 n=5 ec=57/44 lis/c=79/63 les/c/f=80/64/0 sis=81 pruub=15.397624016s) [0] async=[0] r=-1 lpr=81 pi=[63,81)/1 crt=50'991 mlcod 50'991 active pruub 165.632522583s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:09 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 81 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=79/80 n=5 ec=57/44 lis/c=79/63 les/c/f=80/64/0 sis=81 pruub=15.397542953s) [0] r=-1 lpr=81 pi=[63,81)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 165.632522583s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:09 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 81 pg[10.7( v 50'991 (0'0,50'991] local-lis/les=79/80 n=6 ec=57/44 lis/c=79/64 les/c/f=80/65/0 sis=81 pruub=15.396560669s) [0] async=[0] r=-1 lpr=81 pi=[64,81)/1 crt=50'991 mlcod 50'991 active pruub 165.632492065s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:09 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 81 pg[10.7( v 50'991 (0'0,50'991] local-lis/les=79/80 n=6 ec=57/44 lis/c=79/64 les/c/f=80/65/0 sis=81 pruub=15.396460533s) [0] r=-1 lpr=81 pi=[64,81)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 165.632492065s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:09 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 81 pg[10.f( v 50'991 (0'0,50'991] local-lis/les=79/80 n=6 ec=57/44 lis/c=79/63 les/c/f=80/64/0 sis=81 pruub=15.396233559s) [0] async=[0] r=-1 lpr=81 pi=[63,81)/1 crt=50'991 mlcod 50'991 active pruub 165.632568359s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:09 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 81 pg[10.17( v 50'991 (0'0,50'991] local-lis/les=79/80 n=5 ec=57/44 lis/c=79/63 les/c/f=80/64/0 sis=81 pruub=15.395544052s) [0] async=[0] r=-1 lpr=81 pi=[63,81)/1 crt=50'991 mlcod 50'991 active pruub 165.632476807s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:09 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 81 pg[10.17( v 50'991 (0'0,50'991] local-lis/les=79/80 n=5 ec=57/44 lis/c=79/63 les/c/f=80/64/0 sis=81 pruub=15.395473480s) [0] r=-1 lpr=81 pi=[63,81)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 165.632476807s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:09 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 81 pg[10.f( v 50'991 (0'0,50'991] local-lis/les=79/80 n=6 ec=57/44 lis/c=79/63 les/c/f=80/64/0 sis=81 pruub=15.395801544s) [0] r=-1 lpr=81 pi=[63,81)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 165.632568359s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:09 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8b80025c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:10.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:10.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:10 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe89c0016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:10 np0005532763 ceph-mon[75752]: Updating compute-0:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.conf
Nov 23 15:44:10 np0005532763 ceph-mon[75752]: Updating compute-1:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.conf
Nov 23 15:44:10 np0005532763 ceph-mon[75752]: Updating compute-2:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.conf
Nov 23 15:44:10 np0005532763 ceph-mon[75752]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Nov 23 15:44:10 np0005532763 ceph-mon[75752]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Nov 23 15:44:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:10 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e82 e82: 3 total, 3 up, 3 in
Nov 23 15:44:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:11 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe89c0016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:11 np0005532763 ceph-mon[75752]: Updating compute-0:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.client.admin.keyring
Nov 23 15:44:11 np0005532763 ceph-mon[75752]: Updating compute-1:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.client.admin.keyring
Nov 23 15:44:11 np0005532763 ceph-mon[75752]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Nov 23 15:44:11 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:11 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:11 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:11 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:11 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:44:11 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e83 e83: 3 total, 3 up, 3 in
Nov 23 15:44:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:11 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8ac002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:12.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:12.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:12 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe894001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:12 np0005532763 ceph-mon[75752]: Updating compute-2:/var/lib/ceph/03808be8-ae4a-5548-82e6-4a294f1bc627/config/ceph.client.admin.keyring
Nov 23 15:44:12 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e84 e84: 3 total, 3 up, 3 in
Nov 23 15:44:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:13 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe894001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:13 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8b80032d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:14.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:14.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:44:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:14 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe8ac002000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[86979]: 23/11/2025 20:44:15 : epoch 69237205 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe894001fc0 fd 47 proxy ignored for local
Nov 23 15:44:15 np0005532763 kernel: ganesha.nfsd[87279]: segfault at 50 ip 00007fe96cbd032e sp 00007fe9377fd210 error 4 in libntirpc.so.5.8[7fe96cbb5000+2c000] likely on CPU 6 (core 0, socket 6)
Nov 23 15:44:15 np0005532763 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Nov 23 15:44:15 np0005532763 systemd[1]: Started Process Core Dump (PID 88902/UID 0).
Nov 23 15:44:15 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.b scrub starts
Nov 23 15:44:15 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.b scrub ok
Nov 23 15:44:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:15 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]: dispatch
Nov 23 15:44:15 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Nov 23 15:44:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e85 e85: 3 total, 3 up, 3 in
Nov 23 15:44:15 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 85 pg[10.19( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=85 pruub=11.601327896s) [0] r=-1 lpr=85 pi=[63,85)/1 crt=50'991 mlcod 0'0 active pruub 167.933502197s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:15 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 85 pg[10.19( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=85 pruub=11.601292610s) [0] r=-1 lpr=85 pi=[63,85)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 167.933502197s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:15 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 85 pg[6.9( v 49'39 (0'0,49'39] local-lis/les=60/61 n=0 ec=53/18 lis/c=60/60 les/c/f=61/61/0 sis=85 pruub=8.557905197s) [1] r=-1 lpr=85 pi=[60,85)/1 crt=49'39 lcod 0'0 mlcod 0'0 active pruub 164.891906738s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:15 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 85 pg[6.9( v 49'39 (0'0,49'39] local-lis/les=60/61 n=0 ec=53/18 lis/c=60/60 les/c/f=61/61/0 sis=85 pruub=8.557882309s) [1] r=-1 lpr=85 pi=[60,85)/1 crt=49'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 164.891906738s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:15 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 85 pg[10.9( v 50'991 (0'0,50'991] local-lis/les=64/65 n=6 ec=57/44 lis/c=64/64 les/c/f=65/65/0 sis=85 pruub=12.634770393s) [0] r=-1 lpr=85 pi=[64,85)/1 crt=50'991 mlcod 0'0 active pruub 168.969146729s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:15 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 85 pg[10.9( v 50'991 (0'0,50'991] local-lis/les=64/65 n=6 ec=57/44 lis/c=64/64 les/c/f=65/65/0 sis=85 pruub=12.634681702s) [0] r=-1 lpr=85 pi=[64,85)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 168.969146729s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:16.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:16.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:16 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.9 deep-scrub starts
Nov 23 15:44:16 np0005532763 systemd-coredump[88903]: Process 86986 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 42:#012#0  0x00007fe96cbd032e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Nov 23 15:44:16 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.9 deep-scrub ok
Nov 23 15:44:16 np0005532763 systemd[1]: systemd-coredump@1-88902-0.service: Deactivated successfully.
Nov 23 15:44:16 np0005532763 systemd[1]: systemd-coredump@1-88902-0.service: Consumed 1.152s CPU time.
Nov 23 15:44:16 np0005532763 podman[88959]: 2025-11-23 20:44:16.454466165 +0000 UTC m=+0.028603297 container died 3c7188ef2c85599d3ea737b72a10828fd09dcb876706e18dbc8370c9d8ae1457 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 23 15:44:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:16 np0005532763 systemd[1]: var-lib-containers-storage-overlay-4cd9000995bf526ed7de286ae00c465705c8beee74d4048adce569b9dba300e6-merged.mount: Deactivated successfully.
Nov 23 15:44:16 np0005532763 systemd[82270]: Starting Mark boot as successful...
Nov 23 15:44:16 np0005532763 podman[88959]: 2025-11-23 20:44:16.504573006 +0000 UTC m=+0.078710098 container remove 3c7188ef2c85599d3ea737b72a10828fd09dcb876706e18dbc8370c9d8ae1457 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Nov 23 15:44:16 np0005532763 systemd[82270]: Finished Mark boot as successful.
Nov 23 15:44:16 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Main process exited, code=exited, status=139/n/a
Nov 23 15:44:16 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Failed with result 'exit-code'.
Nov 23 15:44:16 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.692s CPU time.
Nov 23 15:44:16 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e86 e86: 3 total, 3 up, 3 in
Nov 23 15:44:16 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 86 pg[10.19( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=86) [0]/[2] r=0 lpr=86 pi=[63,86)/1 crt=50'991 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:16 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 86 pg[10.19( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=86) [0]/[2] r=0 lpr=86 pi=[63,86)/1 crt=50'991 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:16 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 86 pg[10.9( v 50'991 (0'0,50'991] local-lis/les=64/65 n=6 ec=57/44 lis/c=64/64 les/c/f=65/65/0 sis=86) [0]/[2] r=0 lpr=86 pi=[64,86)/1 crt=50'991 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:16 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 86 pg[10.9( v 50'991 (0'0,50'991] local-lis/les=64/65 n=6 ec=57/44 lis/c=64/64 les/c/f=65/65/0 sis=86) [0]/[2] r=0 lpr=86 pi=[64,86)/1 crt=50'991 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:16 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]': finished
Nov 23 15:44:16 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Nov 23 15:44:16 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:16 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:16 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:16 np0005532763 ceph-mon[75752]: Reconfiguring mon.compute-0 (monmap changed)...
Nov 23 15:44:16 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 23 15:44:16 np0005532763 ceph-mon[75752]: Reconfiguring daemon mon.compute-0 on compute-0
Nov 23 15:44:17 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 11.17 scrub starts
Nov 23 15:44:17 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 11.17 scrub ok
Nov 23 15:44:17 np0005532763 systemd-logind[830]: New session 38 of user zuul.
Nov 23 15:44:17 np0005532763 systemd[1]: Started Session 38 of User zuul.
Nov 23 15:44:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:17 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:17 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:17 np0005532763 ceph-mon[75752]: Reconfiguring mgr.compute-0.oyehye (monmap changed)...
Nov 23 15:44:17 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.oyehye", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Nov 23 15:44:17 np0005532763 ceph-mon[75752]: Reconfiguring daemon mgr.compute-0.oyehye on compute-0
Nov 23 15:44:17 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]: dispatch
Nov 23 15:44:17 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Nov 23 15:44:17 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:17 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:17 np0005532763 ceph-mon[75752]: Reconfiguring crash.compute-0 (monmap changed)...
Nov 23 15:44:17 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Nov 23 15:44:17 np0005532763 ceph-mon[75752]: Reconfiguring daemon crash.compute-0 on compute-0
Nov 23 15:44:17 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e87 e87: 3 total, 3 up, 3 in
Nov 23 15:44:17 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 87 pg[10.19( v 50'991 (0'0,50'991] local-lis/les=86/87 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=86) [0]/[2] async=[0] r=0 lpr=86 pi=[63,86)/1 crt=50'991 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:44:17 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 87 pg[10.9( v 50'991 (0'0,50'991] local-lis/les=86/87 n=6 ec=57/44 lis/c=64/64 les/c/f=65/65/0 sis=86) [0]/[2] async=[0] r=0 lpr=86 pi=[64,86)/1 crt=50'991 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:44:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:18.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:18 np0005532763 python3.9[89154]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 23 15:44:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:18.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:18 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.a deep-scrub starts
Nov 23 15:44:18 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.a deep-scrub ok
Nov 23 15:44:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:18 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]': finished
Nov 23 15:44:18 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Nov 23 15:44:18 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:18 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:18 np0005532763 ceph-mon[75752]: Reconfiguring osd.1 (monmap changed)...
Nov 23 15:44:18 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Nov 23 15:44:18 np0005532763 ceph-mon[75752]: Reconfiguring daemon osd.1 on compute-0
Nov 23 15:44:18 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:18 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e88 e88: 3 total, 3 up, 3 in
Nov 23 15:44:19 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 88 pg[10.19( v 50'991 (0'0,50'991] local-lis/les=86/87 n=5 ec=57/44 lis/c=86/63 les/c/f=87/64/0 sis=88 pruub=14.948975563s) [0] async=[0] r=-1 lpr=88 pi=[63,88)/1 crt=50'991 mlcod 50'991 active pruub 174.391448975s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:19 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 88 pg[10.19( v 50'991 (0'0,50'991] local-lis/les=86/87 n=5 ec=57/44 lis/c=86/63 les/c/f=87/64/0 sis=88 pruub=14.948861122s) [0] r=-1 lpr=88 pi=[63,88)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 174.391448975s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:19 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 88 pg[10.9( v 50'991 (0'0,50'991] local-lis/les=86/87 n=6 ec=57/44 lis/c=86/64 les/c/f=87/65/0 sis=88 pruub=14.952485085s) [0] async=[0] r=-1 lpr=88 pi=[64,88)/1 crt=50'991 mlcod 50'991 active pruub 174.396347046s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:19 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 88 pg[10.9( v 50'991 (0'0,50'991] local-lis/les=86/87 n=6 ec=57/44 lis/c=86/64 les/c/f=87/65/0 sis=88 pruub=14.952425003s) [0] r=-1 lpr=88 pi=[64,88)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 174.396347046s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:19 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 11.16 deep-scrub starts
Nov 23 15:44:19 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 11.16 deep-scrub ok
Nov 23 15:44:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:44:19 np0005532763 python3.9[89330]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:44:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:20.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:20.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:20 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.3 scrub starts
Nov 23 15:44:20 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.3 scrub ok
Nov 23 15:44:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:20 np0005532763 python3.9[89487]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:44:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/204421 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:44:21 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.18 scrub starts
Nov 23 15:44:21 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 9.18 scrub ok
Nov 23 15:44:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:21 np0005532763 python3.9[89641]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:44:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:44:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:22.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:44:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:22.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:22 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.16 scrub starts
Nov 23 15:44:22 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 8.16 scrub ok
Nov 23 15:44:22 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e89 e89: 3 total, 3 up, 3 in
Nov 23 15:44:22 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 89 pg[10.b( v 50'991 (0'0,50'991] local-lis/les=63/64 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=89 pruub=13.091240883s) [0] r=-1 lpr=89 pi=[63,89)/1 crt=50'991 mlcod 0'0 active pruub 175.936370850s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:22 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 89 pg[10.b( v 50'991 (0'0,50'991] local-lis/les=63/64 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=89 pruub=13.091188431s) [0] r=-1 lpr=89 pi=[63,89)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 175.936370850s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:22 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 89 pg[10.1b( v 50'991 (0'0,50'991] local-lis/les=64/65 n=5 ec=57/44 lis/c=64/64 les/c/f=65/65/0 sis=89 pruub=14.129295349s) [0] r=-1 lpr=89 pi=[64,89)/1 crt=50'991 mlcod 0'0 active pruub 176.974975586s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:22 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 89 pg[10.1b( v 50'991 (0'0,50'991] local-lis/les=64/65 n=5 ec=57/44 lis/c=64/64 les/c/f=65/65/0 sis=89 pruub=14.129126549s) [0] r=-1 lpr=89 pi=[64,89)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 176.974975586s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:22 np0005532763 ceph-mon[75752]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Nov 23 15:44:22 np0005532763 ceph-mon[75752]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Nov 23 15:44:22 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]: dispatch
Nov 23 15:44:22 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Nov 23 15:44:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:23 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 11.3 scrub starts
Nov 23 15:44:23 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 11.3 scrub ok
Nov 23 15:44:23 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e90 e90: 3 total, 3 up, 3 in
Nov 23 15:44:23 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 90 pg[10.b( v 50'991 (0'0,50'991] local-lis/les=63/64 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=90) [0]/[2] r=0 lpr=90 pi=[63,90)/1 crt=50'991 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:23 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 90 pg[10.b( v 50'991 (0'0,50'991] local-lis/les=63/64 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=90) [0]/[2] r=0 lpr=90 pi=[63,90)/1 crt=50'991 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:23 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 90 pg[10.1b( v 50'991 (0'0,50'991] local-lis/les=64/65 n=5 ec=57/44 lis/c=64/64 les/c/f=65/65/0 sis=90) [0]/[2] r=0 lpr=90 pi=[64,90)/1 crt=50'991 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:23 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 90 pg[10.1b( v 50'991 (0'0,50'991] local-lis/les=64/65 n=5 ec=57/44 lis/c=64/64 les/c/f=65/65/0 sis=90) [0]/[2] r=0 lpr=90 pi=[64,90)/1 crt=50'991 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:23 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]': finished
Nov 23 15:44:23 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Nov 23 15:44:23 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:23 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:23 np0005532763 python3.9[89797]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:44:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:24.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:24 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.3 deep-scrub starts
Nov 23 15:44:24 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 12.3 deep-scrub ok
Nov 23 15:44:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:24.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:44:24 np0005532763 python3.9[89949]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:44:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e91 e91: 3 total, 3 up, 3 in
Nov 23 15:44:24 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 91 pg[10.b( v 50'991 (0'0,50'991] local-lis/les=90/91 n=6 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=90) [0]/[2] async=[0] r=0 lpr=90 pi=[63,90)/1 crt=50'991 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:44:24 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 91 pg[10.1b( v 50'991 (0'0,50'991] local-lis/les=90/91 n=5 ec=57/44 lis/c=64/64 les/c/f=65/65/0 sis=90) [0]/[2] async=[0] r=0 lpr=90 pi=[64,90)/1 crt=50'991 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:44:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:25 np0005532763 python3.9[90101]: ansible-ansible.builtin.service_facts Invoked
Nov 23 15:44:25 np0005532763 ceph-mon[75752]: Reconfiguring grafana.compute-0 (dependencies changed)...
Nov 23 15:44:25 np0005532763 ceph-mon[75752]: Reconfiguring daemon grafana.compute-0 on compute-0
Nov 23 15:44:25 np0005532763 network[90118]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 15:44:25 np0005532763 network[90119]: 'network-scripts' will be removed from distribution in near future.
Nov 23 15:44:25 np0005532763 network[90120]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 15:44:25 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e92 e92: 3 total, 3 up, 3 in
Nov 23 15:44:25 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 92 pg[10.1b( v 50'991 (0'0,50'991] local-lis/les=90/91 n=5 ec=57/44 lis/c=90/64 les/c/f=91/65/0 sis=92 pruub=14.995903015s) [0] async=[0] r=-1 lpr=92 pi=[64,92)/1 crt=50'991 mlcod 50'991 active pruub 180.867584229s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:25 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 92 pg[10.b( v 50'991 (0'0,50'991] local-lis/les=90/91 n=6 ec=57/44 lis/c=90/63 les/c/f=91/64/0 sis=92 pruub=14.995473862s) [0] async=[0] r=-1 lpr=92 pi=[63,92)/1 crt=50'991 mlcod 50'991 active pruub 180.867584229s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:25 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 92 pg[10.1b( v 50'991 (0'0,50'991] local-lis/les=90/91 n=5 ec=57/44 lis/c=90/64 les/c/f=91/65/0 sis=92 pruub=14.995826721s) [0] r=-1 lpr=92 pi=[64,92)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 180.867584229s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:25 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 92 pg[10.b( v 50'991 (0'0,50'991] local-lis/les=90/91 n=6 ec=57/44 lis/c=90/63 les/c/f=91/64/0 sis=92 pruub=14.995191574s) [0] r=-1 lpr=92 pi=[63,92)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 180.867584229s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:26.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:26.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:26 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e93 e93: 3 total, 3 up, 3 in
Nov 23 15:44:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:26 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Scheduled restart job, restart counter is at 2.
Nov 23 15:44:26 np0005532763 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:44:26 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.692s CPU time.
Nov 23 15:44:26 np0005532763 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:44:27 np0005532763 podman[90203]: 2025-11-23 20:44:27.116942862 +0000 UTC m=+0.074159130 container create ba937a03c234c255b66fb413c02373f0864f4ff39a7110f98ac7cdf05f3554e3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Nov 23 15:44:27 np0005532763 podman[90203]: 2025-11-23 20:44:27.085153967 +0000 UTC m=+0.042370245 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:44:27 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/715824f1d1bc139beb6d2e62c6eb1d80a439250a6df66ab9cbab78cff42d6e41/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Nov 23 15:44:27 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/715824f1d1bc139beb6d2e62c6eb1d80a439250a6df66ab9cbab78cff42d6e41/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:44:27 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/715824f1d1bc139beb6d2e62c6eb1d80a439250a6df66ab9cbab78cff42d6e41/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:44:27 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/715824f1d1bc139beb6d2e62c6eb1d80a439250a6df66ab9cbab78cff42d6e41/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.dqbktw-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:44:27 np0005532763 podman[90203]: 2025-11-23 20:44:27.204749416 +0000 UTC m=+0.161965684 container init ba937a03c234c255b66fb413c02373f0864f4ff39a7110f98ac7cdf05f3554e3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Nov 23 15:44:27 np0005532763 podman[90203]: 2025-11-23 20:44:27.214157981 +0000 UTC m=+0.171374259 container start ba937a03c234c255b66fb413c02373f0864f4ff39a7110f98ac7cdf05f3554e3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 15:44:27 np0005532763 bash[90203]: ba937a03c234c255b66fb413c02373f0864f4ff39a7110f98ac7cdf05f3554e3
Nov 23 15:44:27 np0005532763 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:44:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:27 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Nov 23 15:44:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:27 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Nov 23 15:44:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:27 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Nov 23 15:44:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:27 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Nov 23 15:44:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:27 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Nov 23 15:44:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:27 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Nov 23 15:44:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:27 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Nov 23 15:44:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:27 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:44:27 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:27 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:27 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Nov 23 15:44:27 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:27 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/204427 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:44:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:44:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:28.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:44:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:28.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:28 np0005532763 ceph-mon[75752]: Reconfiguring crash.compute-1 (monmap changed)...
Nov 23 15:44:28 np0005532763 ceph-mon[75752]: Reconfiguring daemon crash.compute-1 on compute-1
Nov 23 15:44:28 np0005532763 ceph-mon[75752]: Reconfiguring osd.0 (monmap changed)...
Nov 23 15:44:28 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Nov 23 15:44:28 np0005532763 ceph-mon[75752]: Reconfiguring daemon osd.0 on compute-1
Nov 23 15:44:28 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:28 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:28 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 23 15:44:29 np0005532763 podman[90427]: 2025-11-23 20:44:29.338959407 +0000 UTC m=+0.067776601 container create fb229bcef22ee3eab3e0f25a01645d247a0fe11b9d8531c7c06448c48a17a9cf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_shtern, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Nov 23 15:44:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:44:29 np0005532763 systemd[1]: Started libpod-conmon-fb229bcef22ee3eab3e0f25a01645d247a0fe11b9d8531c7c06448c48a17a9cf.scope.
Nov 23 15:44:29 np0005532763 podman[90427]: 2025-11-23 20:44:29.311488973 +0000 UTC m=+0.040306207 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:44:29 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:44:29 np0005532763 podman[90427]: 2025-11-23 20:44:29.45301946 +0000 UTC m=+0.181836694 container init fb229bcef22ee3eab3e0f25a01645d247a0fe11b9d8531c7c06448c48a17a9cf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_shtern, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 15:44:29 np0005532763 podman[90427]: 2025-11-23 20:44:29.464686469 +0000 UTC m=+0.193503683 container start fb229bcef22ee3eab3e0f25a01645d247a0fe11b9d8531c7c06448c48a17a9cf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_shtern, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Nov 23 15:44:29 np0005532763 podman[90427]: 2025-11-23 20:44:29.468607039 +0000 UTC m=+0.197424263 container attach fb229bcef22ee3eab3e0f25a01645d247a0fe11b9d8531c7c06448c48a17a9cf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_shtern, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Nov 23 15:44:29 np0005532763 dreamy_shtern[90473]: 167 167
Nov 23 15:44:29 np0005532763 systemd[1]: libpod-fb229bcef22ee3eab3e0f25a01645d247a0fe11b9d8531c7c06448c48a17a9cf.scope: Deactivated successfully.
Nov 23 15:44:29 np0005532763 podman[90427]: 2025-11-23 20:44:29.47324377 +0000 UTC m=+0.202060984 container died fb229bcef22ee3eab3e0f25a01645d247a0fe11b9d8531c7c06448c48a17a9cf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_shtern, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1)
Nov 23 15:44:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:29 np0005532763 systemd[1]: var-lib-containers-storage-overlay-40fc12a9680d2d8347d0fcca35dbcd567bb882d5726326d034bbc235b3f8f179-merged.mount: Deactivated successfully.
Nov 23 15:44:29 np0005532763 podman[90427]: 2025-11-23 20:44:29.533448566 +0000 UTC m=+0.262265760 container remove fb229bcef22ee3eab3e0f25a01645d247a0fe11b9d8531c7c06448c48a17a9cf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=dreamy_shtern, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:44:29 np0005532763 systemd[1]: libpod-conmon-fb229bcef22ee3eab3e0f25a01645d247a0fe11b9d8531c7c06448c48a17a9cf.scope: Deactivated successfully.
Nov 23 15:44:29 np0005532763 ceph-mon[75752]: Reconfiguring mon.compute-1 (monmap changed)...
Nov 23 15:44:29 np0005532763 ceph-mon[75752]: Reconfiguring daemon mon.compute-1 on compute-1
Nov 23 15:44:29 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:29 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:29 np0005532763 ceph-mon[75752]: Reconfiguring mon.compute-2 (monmap changed)...
Nov 23 15:44:29 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Nov 23 15:44:29 np0005532763 ceph-mon[75752]: Reconfiguring daemon mon.compute-2 on compute-2
Nov 23 15:44:29 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:29 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:29 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.jtkauz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Nov 23 15:44:29 np0005532763 python3.9[90618]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:44:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:30.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:30 np0005532763 podman[90678]: 2025-11-23 20:44:30.171950025 +0000 UTC m=+0.066548136 container create 1dbdea0a22dca1de914c6b65733bae387579c4d1e7e6cf435edab2de1d500430 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_bhaskara, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2)
Nov 23 15:44:30 np0005532763 systemd[1]: Started libpod-conmon-1dbdea0a22dca1de914c6b65733bae387579c4d1e7e6cf435edab2de1d500430.scope.
Nov 23 15:44:30 np0005532763 podman[90678]: 2025-11-23 20:44:30.145717096 +0000 UTC m=+0.040315267 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:44:30 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 10.4 scrub starts
Nov 23 15:44:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:30.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:30 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:44:30 np0005532763 podman[90678]: 2025-11-23 20:44:30.275551554 +0000 UTC m=+0.170149645 container init 1dbdea0a22dca1de914c6b65733bae387579c4d1e7e6cf435edab2de1d500430 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_bhaskara, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 15:44:30 np0005532763 podman[90678]: 2025-11-23 20:44:30.281364607 +0000 UTC m=+0.175962718 container start 1dbdea0a22dca1de914c6b65733bae387579c4d1e7e6cf435edab2de1d500430 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_bhaskara, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1)
Nov 23 15:44:30 np0005532763 podman[90678]: 2025-11-23 20:44:30.285298178 +0000 UTC m=+0.179896239 container attach 1dbdea0a22dca1de914c6b65733bae387579c4d1e7e6cf435edab2de1d500430 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_bhaskara, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1)
Nov 23 15:44:30 np0005532763 beautiful_bhaskara[90701]: 167 167
Nov 23 15:44:30 np0005532763 systemd[1]: libpod-1dbdea0a22dca1de914c6b65733bae387579c4d1e7e6cf435edab2de1d500430.scope: Deactivated successfully.
Nov 23 15:44:30 np0005532763 podman[90678]: 2025-11-23 20:44:30.286640226 +0000 UTC m=+0.181238297 container died 1dbdea0a22dca1de914c6b65733bae387579c4d1e7e6cf435edab2de1d500430 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_bhaskara, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Nov 23 15:44:30 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 10.4 scrub ok
Nov 23 15:44:30 np0005532763 systemd[1]: var-lib-containers-storage-overlay-554ed7d684156e5b5b91eb68c765a94651e1208a327b8597cee57e856cebed24-merged.mount: Deactivated successfully.
Nov 23 15:44:30 np0005532763 podman[90678]: 2025-11-23 20:44:30.334302569 +0000 UTC m=+0.228900690 container remove 1dbdea0a22dca1de914c6b65733bae387579c4d1e7e6cf435edab2de1d500430 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=beautiful_bhaskara, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Nov 23 15:44:30 np0005532763 systemd[1]: libpod-conmon-1dbdea0a22dca1de914c6b65733bae387579c4d1e7e6cf435edab2de1d500430.scope: Deactivated successfully.
Nov 23 15:44:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:30 np0005532763 ceph-mon[75752]: Reconfiguring mgr.compute-2.jtkauz (monmap changed)...
Nov 23 15:44:30 np0005532763 ceph-mon[75752]: Reconfiguring daemon mgr.compute-2.jtkauz on compute-2
Nov 23 15:44:30 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:30 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:30 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Nov 23 15:44:30 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:30 np0005532763 python3.9[90838]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:44:31 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 10.13 scrub starts
Nov 23 15:44:31 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 10.13 scrub ok
Nov 23 15:44:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:31 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e94 e94: 3 total, 3 up, 3 in
Nov 23 15:44:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 94 pg[10.1c( v 50'991 (0'0,50'991] local-lis/les=72/73 n=5 ec=57/44 lis/c=72/72 les/c/f=73/73/0 sis=94 pruub=8.074006081s) [0] r=-1 lpr=94 pi=[72,94)/1 crt=50'991 mlcod 0'0 active pruub 180.244247437s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 94 pg[10.1c( v 50'991 (0'0,50'991] local-lis/les=72/73 n=5 ec=57/44 lis/c=72/72 les/c/f=73/73/0 sis=94 pruub=8.073841095s) [0] r=-1 lpr=94 pi=[72,94)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 180.244247437s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 94 pg[10.c( v 50'991 (0'0,50'991] local-lis/les=72/73 n=6 ec=57/44 lis/c=72/72 les/c/f=73/73/0 sis=94 pruub=8.072905540s) [0] r=-1 lpr=94 pi=[72,94)/1 crt=50'991 mlcod 0'0 active pruub 180.244262695s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:31 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 94 pg[10.c( v 50'991 (0'0,50'991] local-lis/les=72/73 n=6 ec=57/44 lis/c=72/72 les/c/f=73/73/0 sis=94 pruub=8.072491646s) [0] r=-1 lpr=94 pi=[72,94)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 180.244262695s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:31 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]: dispatch
Nov 23 15:44:31 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Nov 23 15:44:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:32.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:44:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:32.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:44:32 np0005532763 python3.9[90993]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:44:32 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 10.1 deep-scrub starts
Nov 23 15:44:32 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 10.1 deep-scrub ok
Nov 23 15:44:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:32 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]': finished
Nov 23 15:44:32 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Nov 23 15:44:32 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e95 e95: 3 total, 3 up, 3 in
Nov 23 15:44:32 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 95 pg[10.c( v 50'991 (0'0,50'991] local-lis/les=72/73 n=6 ec=57/44 lis/c=72/72 les/c/f=73/73/0 sis=95) [0]/[2] r=0 lpr=95 pi=[72,95)/1 crt=50'991 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:32 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 95 pg[10.c( v 50'991 (0'0,50'991] local-lis/les=72/73 n=6 ec=57/44 lis/c=72/72 les/c/f=73/73/0 sis=95) [0]/[2] r=0 lpr=95 pi=[72,95)/1 crt=50'991 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:32 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 95 pg[10.1c( v 50'991 (0'0,50'991] local-lis/les=72/73 n=5 ec=57/44 lis/c=72/72 les/c/f=73/73/0 sis=95) [0]/[2] r=0 lpr=95 pi=[72,95)/1 crt=50'991 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:32 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 95 pg[10.1c( v 50'991 (0'0,50'991] local-lis/les=72/73 n=5 ec=57/44 lis/c=72/72 les/c/f=73/73/0 sis=95) [0]/[2] r=0 lpr=95 pi=[72,95)/1 crt=50'991 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:33 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 10.3 scrub starts
Nov 23 15:44:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:33 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:44:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:33 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:44:33 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 10.3 scrub ok
Nov 23 15:44:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:33 np0005532763 python3.9[91154]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:44:33 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]: dispatch
Nov 23 15:44:33 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Nov 23 15:44:33 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:33 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:33 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:44:33 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]: dispatch
Nov 23 15:44:33 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Nov 23 15:44:33 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:33 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:33 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:44:33 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e96 e96: 3 total, 3 up, 3 in
Nov 23 15:44:33 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 96 pg[10.c( v 50'991 (0'0,50'991] local-lis/les=95/96 n=6 ec=57/44 lis/c=72/72 les/c/f=73/73/0 sis=95) [0]/[2] async=[0] r=0 lpr=95 pi=[72,95)/1 crt=50'991 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:44:33 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 96 pg[10.1c( v 50'991 (0'0,50'991] local-lis/les=95/96 n=5 ec=57/44 lis/c=72/72 les/c/f=73/73/0 sis=95) [0]/[2] async=[0] r=0 lpr=95 pi=[72,95)/1 crt=50'991 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:44:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:34.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:34.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:34 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Nov 23 15:44:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:44:34 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Nov 23 15:44:34 np0005532763 python3.9[91238]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:44:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:34 np0005532763 ceph-mon[75752]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Nov 23 15:44:34 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]': finished
Nov 23 15:44:34 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Nov 23 15:44:34 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]': finished
Nov 23 15:44:34 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Nov 23 15:44:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e97 e97: 3 total, 3 up, 3 in
Nov 23 15:44:34 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 97 pg[10.1c( v 50'991 (0'0,50'991] local-lis/les=95/96 n=5 ec=57/44 lis/c=95/72 les/c/f=96/73/0 sis=97 pruub=15.005473137s) [0] async=[0] r=-1 lpr=97 pi=[72,97)/1 crt=50'991 mlcod 50'991 active pruub 190.246154785s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:34 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 97 pg[10.1c( v 50'991 (0'0,50'991] local-lis/les=95/96 n=5 ec=57/44 lis/c=95/72 les/c/f=96/73/0 sis=97 pruub=15.005349159s) [0] r=-1 lpr=97 pi=[72,97)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 190.246154785s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:34 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 97 pg[10.c( v 50'991 (0'0,50'991] local-lis/les=95/96 n=6 ec=57/44 lis/c=95/72 les/c/f=96/73/0 sis=97 pruub=15.000094414s) [0] async=[0] r=-1 lpr=97 pi=[72,97)/1 crt=50'991 mlcod 50'991 active pruub 190.241455078s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:34 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 97 pg[10.c( v 50'991 (0'0,50'991] local-lis/les=95/96 n=6 ec=57/44 lis/c=95/72 les/c/f=96/73/0 sis=97 pruub=14.999980927s) [0] r=-1 lpr=97 pi=[72,97)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 190.241455078s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:35 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Nov 23 15:44:35 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Nov 23 15:44:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:35 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]: dispatch
Nov 23 15:44:35 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Nov 23 15:44:35 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e98 e98: 3 total, 3 up, 3 in
Nov 23 15:44:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:36.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:36.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:36 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 10.14 scrub starts
Nov 23 15:44:36 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 10.14 scrub ok
Nov 23 15:44:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:36 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]': finished
Nov 23 15:44:36 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Nov 23 15:44:36 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e99 e99: 3 total, 3 up, 3 in
Nov 23 15:44:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:37 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 23 15:44:37 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Nov 23 15:44:37 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e100 e100: 3 total, 3 up, 3 in
Nov 23 15:44:37 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 100 pg[10.f( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=81/81 les/c/f=82/82/0 sis=100) [2] r=0 lpr=100 pi=[81,100)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:37 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 100 pg[10.1f( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=81/81 les/c/f=82/82/0 sis=100) [2] r=0 lpr=100 pi=[81,100)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:38.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:38.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:38 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 23 15:44:38 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Nov 23 15:44:38 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:38 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:38 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e101 e101: 3 total, 3 up, 3 in
Nov 23 15:44:38 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 101 pg[10.1f( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=81/81 les/c/f=82/82/0 sis=101) [2]/[0] r=-1 lpr=101 pi=[81,101)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:38 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 101 pg[10.1f( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=81/81 les/c/f=82/82/0 sis=101) [2]/[0] r=-1 lpr=101 pi=[81,101)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:38 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 101 pg[10.f( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=81/81 les/c/f=82/82/0 sis=101) [2]/[0] r=-1 lpr=101 pi=[81,101)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:38 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 101 pg[10.f( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=81/81 les/c/f=82/82/0 sis=101) [2]/[0] r=-1 lpr=101 pi=[81,101)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:39 : epoch 6923722b : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e102 e102: 3 total, 3 up, 3 in
Nov 23 15:44:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:40 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca0000df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:40.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:40.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:40 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e103 e103: 3 total, 3 up, 3 in
Nov 23 15:44:40 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 103 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=101/81 les/c/f=102/82/0 sis=103) [2] r=0 lpr=103 pi=[81,103)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:40 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 103 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=101/81 les/c/f=102/82/0 sis=103) [2] r=0 lpr=103 pi=[81,103)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:40 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 103 pg[10.f( v 50'991 (0'0,50'991] local-lis/les=0/0 n=7 ec=57/44 lis/c=101/81 les/c/f=102/82/0 sis=103) [2] r=0 lpr=103 pi=[81,103)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:40 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 103 pg[10.f( v 50'991 (0'0,50'991] local-lis/les=0/0 n=7 ec=57/44 lis/c=101/81 les/c/f=102/82/0 sis=103) [2] r=0 lpr=103 pi=[81,103)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:40 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c880016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:41 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:41 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e104 e104: 3 total, 3 up, 3 in
Nov 23 15:44:41 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 104 pg[10.f( v 50'991 (0'0,50'991] local-lis/les=103/104 n=7 ec=57/44 lis/c=101/81 les/c/f=102/82/0 sis=103) [2] r=0 lpr=103 pi=[81,103)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:44:41 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 104 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=103/104 n=5 ec=57/44 lis/c=101/81 les/c/f=102/82/0 sis=103) [2] r=0 lpr=103 pi=[81,103)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:44:41 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 10.1f deep-scrub starts
Nov 23 15:44:41 np0005532763 ceph-osd[78269]: log_channel(cluster) log [DBG] : 10.1f deep-scrub ok
Nov 23 15:44:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:42 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c74000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:42.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:42.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:42 : epoch 6923722b : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:44:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:42 : epoch 6923722b : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:44:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:42 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c880016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/204443 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:44:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:43 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c880016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:44 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c0016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:44.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:44.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e104 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:44:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:44 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c740016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:45 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c880016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:45 : epoch 6923722b : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:44:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:46 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c98001ac0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:46 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e105 e105: 3 total, 3 up, 3 in
Nov 23 15:44:46 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 105 pg[10.10( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=105) [2] r=0 lpr=105 pi=[57,105)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:46 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Nov 23 15:44:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:46.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:46.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:46 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c0016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:47 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Nov 23 15:44:47 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e106 e106: 3 total, 3 up, 3 in
Nov 23 15:44:47 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 106 pg[10.10( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=106) [2]/[0] r=-1 lpr=106 pi=[57,106)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:47 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 106 pg[10.10( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=57/57 les/c/f=58/58/0 sis=106) [2]/[0] r=-1 lpr=106 pi=[57,106)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:47 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c740016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/204447 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:44:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:48 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c880016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:48 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e107 e107: 3 total, 3 up, 3 in
Nov 23 15:44:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:48.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:48 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Nov 23 15:44:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:48.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:48 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c980025c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:49 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c0016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:49 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Nov 23 15:44:49 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:44:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e108 e108: 3 total, 3 up, 3 in
Nov 23 15:44:49 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 108 pg[10.10( v 50'991 (0'0,50'991] local-lis/les=0/0 n=2 ec=57/44 lis/c=106/57 les/c/f=107/58/0 sis=108) [2] r=0 lpr=108 pi=[57,108)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:49 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 108 pg[10.10( v 50'991 (0'0,50'991] local-lis/les=0/0 n=2 ec=57/44 lis/c=106/57 les/c/f=107/58/0 sis=108) [2] r=0 lpr=108 pi=[57,108)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:44:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:50 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c740016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:50.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:50 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e109 e109: 3 total, 3 up, 3 in
Nov 23 15:44:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:50.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:50 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 109 pg[10.12( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=65/65 les/c/f=66/66/0 sis=109) [2] r=0 lpr=109 pi=[65,109)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:50 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Nov 23 15:44:50 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 109 pg[10.10( v 50'991 (0'0,50'991] local-lis/les=108/109 n=2 ec=57/44 lis/c=106/57 les/c/f=107/58/0 sis=108) [2] r=0 lpr=108 pi=[57,108)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:44:50 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e110 e110: 3 total, 3 up, 3 in
Nov 23 15:44:50 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 110 pg[10.12( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=65/65 les/c/f=66/66/0 sis=110) [2]/[1] r=-1 lpr=110 pi=[65,110)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:50 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 110 pg[10.12( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=65/65 les/c/f=66/66/0 sis=110) [2]/[1] r=-1 lpr=110 pi=[65,110)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:50 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c880016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:51 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c980025c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:51 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Nov 23 15:44:51 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e111 e111: 3 total, 3 up, 3 in
Nov 23 15:44:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:52 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:44:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:52.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:44:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:52.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:52 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c74002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:52 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Nov 23 15:44:52 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e112 e112: 3 total, 3 up, 3 in
Nov 23 15:44:52 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 112 pg[10.13( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=112 pruub=14.873973846s) [1] r=-1 lpr=112 pi=[63,112)/1 crt=50'991 mlcod 0'0 active pruub 207.937866211s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:52 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 112 pg[10.13( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=112 pruub=14.873243332s) [1] r=-1 lpr=112 pi=[63,112)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 207.937866211s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:52 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 112 pg[10.12( v 50'991 (0'0,50'991] local-lis/les=0/0 n=4 ec=57/44 lis/c=110/65 les/c/f=111/66/0 sis=112) [2] r=0 lpr=112 pi=[65,112)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:52 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 112 pg[10.12( v 50'991 (0'0,50'991] local-lis/les=0/0 n=4 ec=57/44 lis/c=110/65 les/c/f=111/66/0 sis=112) [2] r=0 lpr=112 pi=[65,112)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:53 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c880016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:53 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e113 e113: 3 total, 3 up, 3 in
Nov 23 15:44:53 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 113 pg[10.13( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=113) [1]/[2] r=0 lpr=113 pi=[63,113)/1 crt=50'991 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:53 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 113 pg[10.13( v 50'991 (0'0,50'991] local-lis/les=63/64 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=113) [1]/[2] r=0 lpr=113 pi=[63,113)/1 crt=50'991 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:53 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 113 pg[10.14( v 50'991 (0'0,50'991] local-lis/les=72/73 n=5 ec=57/44 lis/c=72/72 les/c/f=73/73/0 sis=113 pruub=10.166436195s) [1] r=-1 lpr=113 pi=[72,113)/1 crt=50'991 mlcod 0'0 active pruub 204.244766235s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:53 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 113 pg[10.14( v 50'991 (0'0,50'991] local-lis/les=72/73 n=5 ec=57/44 lis/c=72/72 les/c/f=73/73/0 sis=113 pruub=10.166232109s) [1] r=-1 lpr=113 pi=[72,113)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 204.244766235s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:53 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 113 pg[10.12( v 50'991 (0'0,50'991] local-lis/les=112/113 n=4 ec=57/44 lis/c=110/65 les/c/f=111/66/0 sis=112) [2] r=0 lpr=112 pi=[65,112)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:44:53 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Nov 23 15:44:53 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Nov 23 15:44:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:54 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c980025c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:54.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000057s ======
Nov 23 15:44:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:54.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Nov 23 15:44:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:44:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:54 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:54 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Nov 23 15:44:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e114 e114: 3 total, 3 up, 3 in
Nov 23 15:44:54 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 114 pg[10.14( v 50'991 (0'0,50'991] local-lis/les=72/73 n=5 ec=57/44 lis/c=72/72 les/c/f=73/73/0 sis=114) [1]/[2] r=0 lpr=114 pi=[72,114)/1 crt=50'991 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:54 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 114 pg[10.14( v 50'991 (0'0,50'991] local-lis/les=72/73 n=5 ec=57/44 lis/c=72/72 les/c/f=73/73/0 sis=114) [1]/[2] r=0 lpr=114 pi=[72,114)/1 crt=50'991 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 23 15:44:54 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 114 pg[10.13( v 50'991 (0'0,50'991] local-lis/les=113/114 n=5 ec=57/44 lis/c=63/63 les/c/f=64/64/0 sis=113) [1]/[2] async=[1] r=0 lpr=113 pi=[63,113)/1 crt=50'991 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:44:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:55 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c74002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:55 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e115 e115: 3 total, 3 up, 3 in
Nov 23 15:44:55 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 115 pg[10.13( v 50'991 (0'0,50'991] local-lis/les=113/114 n=5 ec=57/44 lis/c=113/63 les/c/f=114/64/0 sis=115 pruub=15.359504700s) [1] async=[1] r=-1 lpr=115 pi=[63,115)/1 crt=50'991 mlcod 50'991 active pruub 211.120651245s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:55 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 115 pg[10.13( v 50'991 (0'0,50'991] local-lis/les=113/114 n=5 ec=57/44 lis/c=113/63 les/c/f=114/64/0 sis=115 pruub=15.359414101s) [1] r=-1 lpr=115 pi=[63,115)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 211.120651245s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:55 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 115 pg[10.14( v 50'991 (0'0,50'991] local-lis/les=114/115 n=5 ec=57/44 lis/c=72/72 les/c/f=73/73/0 sis=114) [1]/[2] async=[1] r=0 lpr=114 pi=[72,114)/1 crt=50'991 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:44:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:56 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c880016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:44:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:56.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:44:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:56.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:56 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e116 e116: 3 total, 3 up, 3 in
Nov 23 15:44:56 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 116 pg[10.14( v 50'991 (0'0,50'991] local-lis/les=114/115 n=5 ec=57/44 lis/c=114/72 les/c/f=115/73/0 sis=116 pruub=15.003469467s) [1] async=[1] r=-1 lpr=116 pi=[72,116)/1 crt=50'991 mlcod 50'991 active pruub 211.772705078s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:44:56 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 116 pg[10.14( v 50'991 (0'0,50'991] local-lis/les=114/115 n=5 ec=57/44 lis/c=114/72 les/c/f=115/73/0 sis=116 pruub=15.003398895s) [1] r=-1 lpr=116 pi=[72,116)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 211.772705078s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:44:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:56 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c980036c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:57 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:57 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e117 e117: 3 total, 3 up, 3 in
Nov 23 15:44:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:58 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:44:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:44:58.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:44:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:44:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:44:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:44:58.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:44:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:58 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c880016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:44:59 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c980036c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:44:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Nov 23 15:44:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:44:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:44:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:44:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e118 e118: 3 total, 3 up, 3 in
Nov 23 15:44:59 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Nov 23 15:45:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:00 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:45:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:00.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:45:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:00.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:00 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Nov 23 15:45:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:00 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:01 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c880016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:01 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Nov 23 15:45:01 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e119 e119: 3 total, 3 up, 3 in
Nov 23 15:45:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:02 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c980036c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:02.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:02.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:02 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Nov 23 15:45:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:02 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c980036c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:03 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c980036c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:03 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e120 e120: 3 total, 3 up, 3 in
Nov 23 15:45:03 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Nov 23 15:45:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:04 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c880016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:04.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:04.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e120 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:45:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:04 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c74003820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:04 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Nov 23 15:45:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:05 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c74003820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:05 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Nov 23 15:45:05 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e121 e121: 3 total, 3 up, 3 in
Nov 23 15:45:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:06 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c980036c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:06.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:06.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:06 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c880016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:06 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Nov 23 15:45:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:07 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c74003820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:07 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Nov 23 15:45:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e122 e122: 3 total, 3 up, 3 in
Nov 23 15:45:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:08 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c74003820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:45:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:08.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:45:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:08.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:08 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c980036c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Nov 23 15:45:08 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e123 e123: 3 total, 3 up, 3 in
Nov 23 15:45:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:09 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c880016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:45:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e124 e124: 3 total, 3 up, 3 in
Nov 23 15:45:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:10 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c74003820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:45:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:10.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:45:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:10.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:10 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e125 e125: 3 total, 3 up, 3 in
Nov 23 15:45:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:10 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c74003820 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:11 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c980036c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:11 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e126 e126: 3 total, 3 up, 3 in
Nov 23 15:45:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:12 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c880037a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:12.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:45:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:12.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:45:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:12 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c880037a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:13 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:14 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c980036c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:45:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:14.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:45:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:45:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:14.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:45:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:45:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:14 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c001250 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:15 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c880037a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e127 e127: 3 total, 3 up, 3 in
Nov 23 15:45:15 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Nov 23 15:45:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:16 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:16.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:16.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:16 np0005532763 python3.9[91665]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:45:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:16 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c980036c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:16 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Nov 23 15:45:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:17 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c001d70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:17 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Nov 23 15:45:17 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e128 e128: 3 total, 3 up, 3 in
Nov 23 15:45:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:18 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c880037a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:18.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:45:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:18.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:45:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:18 np0005532763 python3.9[91979]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 23 15:45:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:18 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:18 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Nov 23 15:45:18 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e129 e129: 3 total, 3 up, 3 in
Nov 23 15:45:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[90223]: 23/11/2025 20:45:19 : epoch 6923722b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c980036c0 fd 42 proxy ignored for local
Nov 23 15:45:19 np0005532763 kernel: ganesha.nfsd[91365]: segfault at 50 ip 00007f6d4f0a232e sp 00007f6d227fb210 error 4 in libntirpc.so.5.8[7f6d4f087000+2c000] likely on CPU 4 (core 0, socket 4)
Nov 23 15:45:19 np0005532763 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Nov 23 15:45:19 np0005532763 systemd[1]: Started Process Core Dump (PID 92087/UID 0).
Nov 23 15:45:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:19 np0005532763 python3.9[92134]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 23 15:45:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:45:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e130 e130: 3 total, 3 up, 3 in
Nov 23 15:45:20 np0005532763 systemd-coredump[92102]: Process 90230 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 42:#012#0  0x00007f6d4f0a232e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Nov 23 15:45:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:20.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:20 np0005532763 systemd[1]: systemd-coredump@2-92087-0.service: Deactivated successfully.
Nov 23 15:45:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:45:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:20.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:45:20 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e131 e131: 3 total, 3 up, 3 in
Nov 23 15:45:20 np0005532763 podman[92293]: 2025-11-23 20:45:20.381986823 +0000 UTC m=+0.037327266 container died ba937a03c234c255b66fb413c02373f0864f4ff39a7110f98ac7cdf05f3554e3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Nov 23 15:45:20 np0005532763 python3.9[92288]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:45:20 np0005532763 systemd[1]: var-lib-containers-storage-overlay-715824f1d1bc139beb6d2e62c6eb1d80a439250a6df66ab9cbab78cff42d6e41-merged.mount: Deactivated successfully.
Nov 23 15:45:20 np0005532763 podman[92293]: 2025-11-23 20:45:20.426768637 +0000 UTC m=+0.082109040 container remove ba937a03c234c255b66fb413c02373f0864f4ff39a7110f98ac7cdf05f3554e3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True)
Nov 23 15:45:20 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Main process exited, code=exited, status=139/n/a
Nov 23 15:45:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:20 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Failed with result 'exit-code'.
Nov 23 15:45:20 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.276s CPU time.
Nov 23 15:45:21 np0005532763 python3.9[92489]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 23 15:45:21 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e132 e132: 3 total, 3 up, 3 in
Nov 23 15:45:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:22.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:45:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:22.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:45:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:22 np0005532763 python3.9[92643]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:45:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:23 np0005532763 python3.9[92795]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:45:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:24.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:45:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:24.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:45:24 np0005532763 python3.9[92874]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:45:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:45:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/204525 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:45:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:25 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Nov 23 15:45:25 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e133 e133: 3 total, 3 up, 3 in
Nov 23 15:45:25 np0005532763 python3.9[93027]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:45:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:45:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:26.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:45:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:26.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:26 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Nov 23 15:45:27 np0005532763 python3.9[93183]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 23 15:45:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:27 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Nov 23 15:45:28 np0005532763 python3.9[93337]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 23 15:45:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:45:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:28.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:45:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:28.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:28 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e134 e134: 3 total, 3 up, 3 in
Nov 23 15:45:28 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Nov 23 15:45:29 np0005532763 python3.9[93491]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 23 15:45:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:45:30 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Nov 23 15:45:30 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e135 e135: 3 total, 3 up, 3 in
Nov 23 15:45:30 np0005532763 python3.9[93644]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 23 15:45:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:30.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 135 pg[10.1e( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=79/79 les/c/f=80/80/0 sis=135) [2] r=0 lpr=135 pi=[79,135)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:45:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:45:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:30.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:45:30 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e136 e136: 3 total, 3 up, 3 in
Nov 23 15:45:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 136 pg[10.1e( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=79/79 les/c/f=80/80/0 sis=136) [2]/[0] r=-1 lpr=136 pi=[79,136)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:45:30 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 136 pg[10.1e( empty local-lis/les=0/0 n=0 ec=57/44 lis/c=79/79 les/c/f=80/80/0 sis=136) [2]/[0] r=-1 lpr=136 pi=[79,136)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Nov 23 15:45:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:30 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Scheduled restart job, restart counter is at 3.
Nov 23 15:45:30 np0005532763 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:45:30 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.276s CPU time.
Nov 23 15:45:30 np0005532763 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:45:31 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Nov 23 15:45:31 np0005532763 podman[93846]: 2025-11-23 20:45:31.193188563 +0000 UTC m=+0.071394707 container create e28fbc6ba0fdd4406bf38781cfbbcee6b5f6bf3434626964fa14eb222d1569b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Nov 23 15:45:31 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/299620c935f64a202c2a323e336566fb64411149883365fed255920042bb21e5/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Nov 23 15:45:31 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/299620c935f64a202c2a323e336566fb64411149883365fed255920042bb21e5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:45:31 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/299620c935f64a202c2a323e336566fb64411149883365fed255920042bb21e5/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:45:31 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/299620c935f64a202c2a323e336566fb64411149883365fed255920042bb21e5/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.dqbktw-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:45:31 np0005532763 podman[93846]: 2025-11-23 20:45:31.160889601 +0000 UTC m=+0.039095825 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:45:31 np0005532763 podman[93846]: 2025-11-23 20:45:31.267761739 +0000 UTC m=+0.145967893 container init e28fbc6ba0fdd4406bf38781cfbbcee6b5f6bf3434626964fa14eb222d1569b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 15:45:31 np0005532763 podman[93846]: 2025-11-23 20:45:31.275897758 +0000 UTC m=+0.154103902 container start e28fbc6ba0fdd4406bf38781cfbbcee6b5f6bf3434626964fa14eb222d1569b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 23 15:45:31 np0005532763 bash[93846]: e28fbc6ba0fdd4406bf38781cfbbcee6b5f6bf3434626964fa14eb222d1569b3
Nov 23 15:45:31 np0005532763 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:45:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Nov 23 15:45:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Nov 23 15:45:31 np0005532763 python3.9[93842]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:45:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Nov 23 15:45:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Nov 23 15:45:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Nov 23 15:45:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Nov 23 15:45:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Nov 23 15:45:31 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e137 e137: 3 total, 3 up, 3 in
Nov 23 15:45:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:45:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:32.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:32.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:32 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e138 e138: 3 total, 3 up, 3 in
Nov 23 15:45:32 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 138 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=103/104 n=5 ec=57/44 lis/c=103/103 les/c/f=104/104/0 sis=138 pruub=12.893539429s) [0] r=-1 lpr=138 pi=[103,138)/1 crt=50'991 mlcod 0'0 active pruub 245.771652222s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:45:32 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 138 pg[10.1e( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=136/79 les/c/f=137/80/0 sis=138) [2] r=0 lpr=138 pi=[79,138)/1 luod=0'0 crt=50'991 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:45:32 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 138 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=103/104 n=5 ec=57/44 lis/c=103/103 les/c/f=104/104/0 sis=138 pruub=12.893495560s) [0] r=-1 lpr=138 pi=[103,138)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 245.771652222s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:45:32 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 138 pg[10.1e( v 50'991 (0'0,50'991] local-lis/les=0/0 n=5 ec=57/44 lis/c=136/79 les/c/f=137/80/0 sis=138) [2] r=0 lpr=138 pi=[79,138)/1 crt=50'991 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Nov 23 15:45:32 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Nov 23 15:45:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:33 np0005532763 python3.9[94057]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e139 e139: 3 total, 3 up, 3 in
Nov 23 15:45:33 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 139 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=103/104 n=5 ec=57/44 lis/c=103/103 les/c/f=104/104/0 sis=139) [0]/[2] r=0 lpr=139 pi=[103,139)/1 crt=50'991 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:45:33 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 139 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=103/104 n=5 ec=57/44 lis/c=103/103 les/c/f=104/104/0 sis=139) [0]/[2] r=0 lpr=139 pi=[103,139)/1 crt=50'991 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Nov 23 15:45:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:33 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 139 pg[10.1e( v 50'991 (0'0,50'991] local-lis/les=138/139 n=5 ec=57/44 lis/c=136/79 les/c/f=137/80/0 sis=138) [2] r=0 lpr=138 pi=[79,138)/1 crt=50'991 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:45:33.664200) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930733664298, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 3142, "num_deletes": 252, "total_data_size": 10546717, "memory_usage": 10929824, "flush_reason": "Manual Compaction"}
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930733699581, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 6620082, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7582, "largest_seqno": 10719, "table_properties": {"data_size": 6606243, "index_size": 8861, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3909, "raw_key_size": 34783, "raw_average_key_size": 22, "raw_value_size": 6576054, "raw_average_value_size": 4270, "num_data_blocks": 384, "num_entries": 1540, "num_filter_entries": 1540, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930629, "oldest_key_time": 1763930629, "file_creation_time": 1763930733, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 35440 microseconds, and 20971 cpu microseconds.
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:45:33.699644) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 6620082 bytes OK
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:45:33.699672) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:45:33.701381) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:45:33.701409) EVENT_LOG_v1 {"time_micros": 1763930733701402, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:45:33.701432) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 10531575, prev total WAL file size 10531575, number of live WAL files 2.
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:45:33.705657) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(6464KB)], [18(11MB)]
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930733705738, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 18327492, "oldest_snapshot_seqno": -1}
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4075 keys, 13907818 bytes, temperature: kUnknown
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930733783384, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 13907818, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13875213, "index_size": 21295, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10245, "raw_key_size": 104103, "raw_average_key_size": 25, "raw_value_size": 13795255, "raw_average_value_size": 3385, "num_data_blocks": 915, "num_entries": 4075, "num_filter_entries": 4075, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763930733, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:45:33.783640) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 13907818 bytes
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:45:33.785348) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 235.8 rd, 179.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(6.3, 11.2 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(4.9) write-amplify(2.1) OK, records in: 4613, records dropped: 538 output_compression: NoCompression
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:45:33.785378) EVENT_LOG_v1 {"time_micros": 1763930733785365, "job": 8, "event": "compaction_finished", "compaction_time_micros": 77710, "compaction_time_cpu_micros": 52882, "output_level": 6, "num_output_files": 1, "total_output_size": 13907818, "num_input_records": 4613, "num_output_records": 4075, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930733787783, "job": 8, "event": "table_file_deletion", "file_number": 20}
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930733791721, "job": 8, "event": "table_file_deletion", "file_number": 18}
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:45:33.705516) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:45:33.791789) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:45:33.791798) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:45:33.791801) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:45:33.791804) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:45:33 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:45:33.791807) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:45:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:45:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:34.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:45:34 np0005532763 python3.9[94210]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:45:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:45:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:34.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:45:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e140 e140: 3 total, 3 up, 3 in
Nov 23 15:45:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:45:34 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 140 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=139/140 n=5 ec=57/44 lis/c=103/103 les/c/f=104/104/0 sis=139) [0]/[2] async=[0] r=0 lpr=139 pi=[103,139)/1 crt=50'991 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Nov 23 15:45:34 np0005532763 python3.9[94289]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:45:35 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e141 e141: 3 total, 3 up, 3 in
Nov 23 15:45:35 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 141 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=139/140 n=5 ec=57/44 lis/c=139/103 les/c/f=140/104/0 sis=141 pruub=15.356277466s) [0] async=[0] r=-1 lpr=141 pi=[103,141)/1 crt=50'991 mlcod 50'991 active pruub 251.150253296s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Nov 23 15:45:35 np0005532763 ceph-osd[78269]: osd.2 pg_epoch: 141 pg[10.1f( v 50'991 (0'0,50'991] local-lis/les=139/140 n=5 ec=57/44 lis/c=139/103 les/c/f=140/104/0 sis=141 pruub=15.356187820s) [0] r=-1 lpr=141 pi=[103,141)/1 crt=50'991 mlcod 0'0 unknown NOTIFY pruub 251.150253296s@ mbc={}] state<Start>: transitioning to Stray
Nov 23 15:45:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:35 np0005532763 python3.9[94441]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:45:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:36.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:36 np0005532763 python3.9[94520]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:45:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:45:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:36.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:45:36 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 e142: 3 total, 3 up, 3 in
Nov 23 15:45:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:37 np0005532763 python3.9[94698]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:45:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:37 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:45:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:37 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:45:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:38.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:38.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:45:40 np0005532763 python3.9[94934]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:45:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:40.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:40.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:40 np0005532763 python3.9[95087]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 23 15:45:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:41 np0005532763 python3.9[95238]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:45:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:42.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:42.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:42 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:45:42 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:45:43 np0005532763 python3.9[95391]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:45:43 np0005532763 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 23 15:45:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:43 np0005532763 systemd[1]: tuned.service: Deactivated successfully.
Nov 23 15:45:43 np0005532763 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 23 15:45:43 np0005532763 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 23 15:45:43 np0005532763 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 23 15:45:43 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:45:43 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:45:43 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:45:43 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:45:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:45:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:44.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:45:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:44.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:45:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:44 np0005532763 python3.9[95566]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 23 15:45:44 np0005532763 ceph-mon[75752]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s))
Nov 23 15:45:44 np0005532763 ceph-mon[75752]: Cluster is now healthy
Nov 23 15:45:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:45 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:46 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:46.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:46.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:46 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/204547 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:45:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:47 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:48 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:48.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:45:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:48.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:45:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:48 np0005532763 python3.9[95725]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:45:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:48 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80000016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:49 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:49 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:45:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:45:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:50 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:50.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:45:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:50.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:45:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:50 np0005532763 python3.9[95881]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:45:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:50 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:51 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:51 np0005532763 systemd-logind[830]: Session 38 logged out. Waiting for processes to exit.
Nov 23 15:45:51 np0005532763 systemd[1]: session-38.scope: Deactivated successfully.
Nov 23 15:45:51 np0005532763 systemd[1]: session-38.scope: Consumed 1min 7.801s CPU time.
Nov 23 15:45:51 np0005532763 systemd-logind[830]: Removed session 38.
Nov 23 15:45:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:52 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80000016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:52.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:52.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:52 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:53 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80240089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:53 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:45:53 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:45:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:54 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:54.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:45:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:54.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:45:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:45:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:54 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80000016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:55 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:56 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80240089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:56.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:45:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:56.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:45:56 np0005532763 systemd-logind[830]: New session 39 of user zuul.
Nov 23 15:45:56 np0005532763 systemd[1]: Started Session 39 of User zuul.
Nov 23 15:45:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:56 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80240089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:57 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80240089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:57 np0005532763 python3.9[96117]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:45:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:58 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80240089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:45:58.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:45:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:45:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:45:58.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:45:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:58 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:58 np0005532763 python3.9[96275]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 23 15:45:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:45:59 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:45:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:45:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:45:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:45:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:46:00 np0005532763 python3.9[96429]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:46:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:00 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:00.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:00.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:00 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802400a020 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:00 np0005532763 python3.9[96514]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 23 15:46:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:01 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:02 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:02.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:02.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:02 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:03 np0005532763 python3.9[96669]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:46:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:03 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802400a020 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:04 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:04.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:46:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:04.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:46:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:46:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:04 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:05 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:05 np0005532763 python3.9[96824]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 15:46:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:06 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802400a020 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:06.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:06.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:06 np0005532763 python3.9[96979]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:46:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:06 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:07 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:07 np0005532763 python3.9[97131]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 23 15:46:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:08 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:08.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:46:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:08.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:46:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:09 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802400a1c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:09 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:09 np0005532763 python3.9[97283]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:46:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:46:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:10 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:46:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:10.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:46:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000057s ======
Nov 23 15:46:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:10.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Nov 23 15:46:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:10 np0005532763 python3.9[97443]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:46:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:11 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:11 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802400a1c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:12 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:46:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:12.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:46:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:46:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:12.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:46:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:12 np0005532763 python3.9[97598]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:46:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:13 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:13 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:14 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802400a1c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:14.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:46:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:14.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:46:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:46:14 np0005532763 python3.9[97887]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 15:46:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:15 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:15 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003cc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:15 np0005532763 python3.9[98037]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:46:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:16 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:16.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:16.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:16 np0005532763 python3.9[98196]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:46:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:17 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:17 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:18 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:18.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:46:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:18.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:46:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:18 np0005532763 python3.9[98376]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:46:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:19 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4000d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:19 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:46:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:20 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:20.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:46:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:20.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:46:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:20 np0005532763 python3.9[98531]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:46:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:21 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:21 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:21 np0005532763 python3.9[98686]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Nov 23 15:46:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:22 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:22.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:22.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:23 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:23 np0005532763 systemd[1]: session-39.scope: Deactivated successfully.
Nov 23 15:46:23 np0005532763 systemd[1]: session-39.scope: Consumed 20.259s CPU time.
Nov 23 15:46:23 np0005532763 systemd-logind[830]: Session 39 logged out. Waiting for processes to exit.
Nov 23 15:46:23 np0005532763 systemd-logind[830]: Removed session 39.
Nov 23 15:46:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:23 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff40018b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:24 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:24.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:24.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:46:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:25 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:25 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:26 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff40021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:26.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:26.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:27 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:27 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:28 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:28.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:46:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:28.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:46:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:28 np0005532763 systemd-logind[830]: New session 40 of user zuul.
Nov 23 15:46:28 np0005532763 systemd[1]: Started Session 40 of User zuul.
Nov 23 15:46:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:29 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff40021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:29 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:46:29 np0005532763 python3.9[98871]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:46:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:30 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:30.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:46:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:30.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:46:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:30 np0005532763 python3.9[99027]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:46:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:32 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:32 np0005532763 python3.9[99221]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:46:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:46:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:32.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:46:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:46:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:32.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:46:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:32 np0005532763 systemd[1]: session-40.scope: Deactivated successfully.
Nov 23 15:46:32 np0005532763 systemd[1]: session-40.scope: Consumed 2.925s CPU time.
Nov 23 15:46:32 np0005532763 systemd-logind[830]: Session 40 logged out. Waiting for processes to exit.
Nov 23 15:46:32 np0005532763 systemd-logind[830]: Removed session 40.
Nov 23 15:46:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:33 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:33 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:34 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4002370 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:34.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:34.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:46:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:35 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:35 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:36.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:36 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:36.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:37 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4002370 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:37 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4002370 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:37 np0005532763 systemd-logind[830]: New session 41 of user zuul.
Nov 23 15:46:37 np0005532763 systemd[1]: Started Session 41 of User zuul.
Nov 23 15:46:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/204637 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:46:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:38.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:38 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:38.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:38 np0005532763 python3.9[99431]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:46:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:39 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:39 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:46:39 np0005532763 python3.9[99586]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:46:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:46:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:40.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:46:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:40 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:46:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:40.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:46:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:40 np0005532763 python3.9[99744]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:46:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:41 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:41 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:41 np0005532763 python3.9[99828]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:46:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:42.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:42 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:42.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:42 np0005532763 systemd[1]: session-19.scope: Deactivated successfully.
Nov 23 15:46:42 np0005532763 systemd[1]: session-19.scope: Consumed 10.880s CPU time.
Nov 23 15:46:42 np0005532763 systemd-logind[830]: Session 19 logged out. Waiting for processes to exit.
Nov 23 15:46:42 np0005532763 systemd-logind[830]: Removed session 19.
Nov 23 15:46:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:43 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:43 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:43 np0005532763 python3.9[99983]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:46:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:44.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:44.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:46:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:45 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:45 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:45 np0005532763 python3.9[100180]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:46:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:46 np0005532763 python3.9[100333]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:46:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:46:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:46.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:46:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:46 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:46.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:47 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:47 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:47 np0005532763 python3.9[100499]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:46:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:47 : epoch 6923726b : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:46:47 np0005532763 python3.9[100581]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:46:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:46:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:48.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:46:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:48 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:46:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:48.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:46:48 np0005532763 python3.9[100734]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:46:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:49 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:49 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:49 np0005532763 python3.9[100812]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:46:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:46:50 np0005532763 python3.9[100965]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:46:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:46:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:50.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:46:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:50 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:50.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:50 : epoch 6923726b : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:46:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:50 : epoch 6923726b : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:46:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:51 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024001320 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:51 np0005532763 python3.9[101118]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:46:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:51 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:51 np0005532763 python3.9[101271]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:46:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:52.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:52 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:52.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:52 np0005532763 python3.9[101424]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:46:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:53 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:53 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024001320 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:53 np0005532763 python3.9[101640]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:46:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:53 : epoch 6923726b : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:46:54 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:46:54 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:46:54 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:46:54 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:46:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:54.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:54 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:54.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:46:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:55 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:55 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:56 np0005532763 python3.9[101813]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:46:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:56.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:56 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024001320 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:56.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:56 np0005532763 python3.9[101969]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:46:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:57 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:57 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:57 np0005532763 python3.9[102147]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:46:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:46:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:46:58.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:46:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:58 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:46:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:46:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:46:58.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:46:58 np0005532763 python3.9[102325]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:46:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:59 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024001320 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:59 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:46:59 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:46:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:46:59 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:46:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:46:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:46:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:46:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:46:59 np0005532763 python3.9[102479]: ansible-service_facts Invoked
Nov 23 15:46:59 np0005532763 network[102496]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 15:46:59 np0005532763 network[102497]: 'network-scripts' will be removed from distribution in near future.
Nov 23 15:46:59 np0005532763 network[102498]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 15:47:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/204700 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:47:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:00.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:00 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000030s ======
Nov 23 15:47:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:00.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Nov 23 15:47:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:01 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:01 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024001320 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:02.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:02 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000030s ======
Nov 23 15:47:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:02.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Nov 23 15:47:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:03 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:03 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:04.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:04 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024001320 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:04.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:47:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:05 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:05 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:06 np0005532763 python3.9[102956]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:47:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000030s ======
Nov 23 15:47:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:06.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Nov 23 15:47:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:06 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:06.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:07 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024001320 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:07 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000030s ======
Nov 23 15:47:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:08.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Nov 23 15:47:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:08 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:08.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:08 np0005532763 python3.9[103112]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 23 15:47:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:09 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:09 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024001320 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:47:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:10.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:10 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:47:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:10.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:47:11 np0005532763 python3.9[103266]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:47:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:11 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:11 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:11 np0005532763 python3.9[103344]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:47:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:12.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:12 np0005532763 python3.9[103498]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:47:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:12 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024001320 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:47:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:12.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:47:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:13 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:13 np0005532763 python3.9[103576]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:47:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:13 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:14.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:47:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:14 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:14.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:14 np0005532763 python3.9[103731]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:47:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:15 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024001320 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:15 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:16.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:16 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:47:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:16.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:47:16 np0005532763 python3.9[103885]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:47:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:17 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80200041f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:17 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024001320 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:17 np0005532763 python3.9[103994]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:47:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:47:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:18.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:47:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:18 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000030s ======
Nov 23 15:47:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:18.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Nov 23 15:47:18 np0005532763 systemd-logind[830]: Session 41 logged out. Waiting for processes to exit.
Nov 23 15:47:18 np0005532763 systemd[1]: session-41.scope: Deactivated successfully.
Nov 23 15:47:18 np0005532763 systemd[1]: session-41.scope: Consumed 29.003s CPU time.
Nov 23 15:47:18 np0005532763 systemd-logind[830]: Removed session 41.
Nov 23 15:47:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:19 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:19 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:47:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:20.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:20 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:20.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:21 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018001220 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:21 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:47:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:22.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:47:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:22 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:22.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:23 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:23 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002030 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:24 np0005532763 systemd-logind[830]: New session 42 of user zuul.
Nov 23 15:47:24 np0005532763 systemd[1]: Started Session 42 of User zuul.
Nov 23 15:47:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:24.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:47:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:24 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:47:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:24.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:47:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:25 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:25 np0005532763 python3.9[104186]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:47:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:25 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:25 np0005532763 python3.9[104339]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:47:26 np0005532763 python3.9[104417]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:47:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:47:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:26.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:47:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:26 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002030 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:26.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:26 np0005532763 systemd[1]: session-42.scope: Deactivated successfully.
Nov 23 15:47:26 np0005532763 systemd[1]: session-42.scope: Consumed 1.913s CPU time.
Nov 23 15:47:26 np0005532763 systemd-logind[830]: Session 42 logged out. Waiting for processes to exit.
Nov 23 15:47:26 np0005532763 systemd-logind[830]: Removed session 42.
Nov 23 15:47:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:27 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:27 np0005532763 systemd[82270]: Created slice User Background Tasks Slice.
Nov 23 15:47:27 np0005532763 systemd[82270]: Starting Cleanup of User's Temporary Files and Directories...
Nov 23 15:47:27 np0005532763 systemd[82270]: Finished Cleanup of User's Temporary Files and Directories.
Nov 23 15:47:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:27 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:28.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:28 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:47:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:28.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:47:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:29 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002030 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:29 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:47:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:30.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:30 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:47:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:30.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:47:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003130 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:32 np0005532763 systemd-logind[830]: New session 43 of user zuul.
Nov 23 15:47:32 np0005532763 systemd[1]: Started Session 43 of User zuul.
Nov 23 15:47:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000030s ======
Nov 23 15:47:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:32.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Nov 23 15:47:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:32 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:47:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:32.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:47:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:33 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:33 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:33 np0005532763 python3.9[104603]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:47:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:34.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:34 np0005532763 python3.9[104761]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:47:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:47:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:34 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003130 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000030s ======
Nov 23 15:47:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:34.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Nov 23 15:47:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:35 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:35 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc003f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:35 np0005532763 python3.9[104936]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:47:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:35 np0005532763 python3.9[105015]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.41kmvjhi recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:47:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:36.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:36 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:36.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:37 np0005532763 python3.9[105168]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:47:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:37 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003130 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:37 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003130 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:37 np0005532763 python3.9[105271]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.ohuegfns recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:47:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:47:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:38.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:47:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:38 np0005532763 python3.9[105425]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:47:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:38 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003130 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:38.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:39 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003c90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:39 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:39 np0005532763 python3.9[105577]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:47:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:47:39 np0005532763 python3.9[105656]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:47:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000030s ======
Nov 23 15:47:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:40.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Nov 23 15:47:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:40 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:40 np0005532763 python3.9[105809]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:47:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:40.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:41 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:41 np0005532763 python3.9[105887]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:47:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:41 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:42 np0005532763 python3.9[106040]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:47:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:42.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:42 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:42.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:42 np0005532763 python3.9[106193]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:47:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:43 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003130 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:43 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc003f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:43 np0005532763 python3.9[106271]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:47:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:44 np0005532763 python3.9[106424]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:47:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000030s ======
Nov 23 15:47:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:44.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Nov 23 15:47:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:47:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003cd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:44.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:44 np0005532763 python3.9[106503]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:47:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:45 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4003fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:45 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180045c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:46 np0005532763 python3.9[106656]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:47:46 np0005532763 systemd[1]: Reloading.
Nov 23 15:47:46 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:47:46 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:47:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:46.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:46 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc003f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:46.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:47 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:47 np0005532763 python3.9[106845]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:47:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:47 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:47 np0005532763 python3.9[106923]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:47:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000030s ======
Nov 23 15:47:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:48.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Nov 23 15:47:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:48 np0005532763 python3.9[107077]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:47:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:48 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180045c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:48.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:49 np0005532763 python3.9[107156]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:47:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:49 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc003f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:49 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:47:50 np0005532763 python3.9[107309]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:47:50 np0005532763 systemd[1]: Reloading.
Nov 23 15:47:50 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:47:50 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:47:50 np0005532763 systemd[1]: Starting Create netns directory...
Nov 23 15:47:50 np0005532763 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 15:47:50 np0005532763 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 15:47:50 np0005532763 systemd[1]: Finished Create netns directory.
Nov 23 15:47:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:47:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:50.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:47:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:50 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:50.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:51 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180045c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:51 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:51 np0005532763 python3.9[107504]: ansible-ansible.builtin.service_facts Invoked
Nov 23 15:47:51 np0005532763 network[107521]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 15:47:51 np0005532763 network[107522]: 'network-scripts' will be removed from distribution in near future.
Nov 23 15:47:51 np0005532763 network[107523]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 15:47:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:47:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:52.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:47:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:52 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:52.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:53 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:53 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180045c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:54.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:47:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:54 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180045c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:54.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:55 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020001ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:55 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:47:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:56.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:47:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:56 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:47:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:56.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:47:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:57 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180045c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:57 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:47:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:47:58.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:47:58 np0005532763 python3.9[107818]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:47:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:58 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:47:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:47:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:47:58.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:47:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:59 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:59 np0005532763 python3.9[107958]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:47:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:47:59 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180045c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:47:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:47:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:47:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:47:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:47:59 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:47:59 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:47:59 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:47:59 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:48:00 np0005532763 python3.9[108128]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:48:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:00.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:48:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:00 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:00.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:00 np0005532763 python3.9[108281]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:48:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:01 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:01 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:01 np0005532763 python3.9[108359]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:02.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:02 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180045c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:02.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:02 np0005532763 python3.9[108513]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 23 15:48:02 np0005532763 systemd[1]: Starting Time & Date Service...
Nov 23 15:48:02 np0005532763 systemd[1]: Started Time & Date Service.
Nov 23 15:48:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:03 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180045c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:03 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003cf0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:03 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:48:03 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:48:03 np0005532763 python3.9[108693]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:04.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:48:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:04 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:04.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:04 np0005532763 python3.9[108848]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:48:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:05 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:05 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180045c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:05 np0005532763 python3.9[108926]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:06 np0005532763 python3.9[109079]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:48:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:06.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:06 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003d10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:48:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:06.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:48:06 np0005532763 python3.9[109158]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.xtwg447h recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:07 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:07 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:07 np0005532763 python3.9[109310]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:48:08 np0005532763 python3.9[109389]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:08.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:08 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180045c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:08.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:09 np0005532763 python3.9[109542]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:48:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:09 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003d30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:09 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:48:10 np0005532763 python3[109696]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 23 15:48:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:48:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:10.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:48:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:10 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:10.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:11 np0005532763 python3.9[109849]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:48:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:11 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180045c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:11 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:11 np0005532763 python3.9[109927]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:12 np0005532763 python3.9[110080]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:48:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:12.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:12 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:12.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:13 np0005532763 python3.9[110159]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:13 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:13 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180045c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:13 np0005532763 python3.9[110312]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:48:14 np0005532763 python3.9[110391]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:14.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:48:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:14 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:14.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:15 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:15 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:15 np0005532763 python3.9[110543]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:48:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:15 np0005532763 python3.9[110622]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:48:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:16.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:48:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:16 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180045c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:16.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:16 np0005532763 python3.9[110775]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:48:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:17 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:17 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:17 np0005532763 python3.9[110853]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:18 np0005532763 python3.9[111031]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:48:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:18.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:18 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:18.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:19 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180045c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:19 np0005532763 python3.9[111187]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:19 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:48:20 np0005532763 python3.9[111340]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:48:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:20.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:48:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:20 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:20.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:21 np0005532763 python3.9[111494]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:21 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:21 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:22 np0005532763 python3.9[111647]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 23 15:48:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:48:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:22.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:48:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:22 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:48:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:22.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:48:22 np0005532763 python3.9[111801]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 23 15:48:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:23 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:23 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:23 np0005532763 systemd[1]: session-43.scope: Deactivated successfully.
Nov 23 15:48:23 np0005532763 systemd[1]: session-43.scope: Consumed 37.799s CPU time.
Nov 23 15:48:23 np0005532763 systemd-logind[830]: Session 43 logged out. Waiting for processes to exit.
Nov 23 15:48:23 np0005532763 systemd-logind[830]: Removed session 43.
Nov 23 15:48:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:24.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:48:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:24 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:24.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:25 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:25 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:26.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:26 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:48:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:26.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:48:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:27 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:27 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:28.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:28 np0005532763 systemd-logind[830]: New session 44 of user zuul.
Nov 23 15:48:28 np0005532763 systemd[1]: Started Session 44 of User zuul.
Nov 23 15:48:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:28 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:28.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:29 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:29 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:29 np0005532763 python3.9[111987]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 23 15:48:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:48:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:30.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:30 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:30 np0005532763 python3.9[112141]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:48:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:30.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:31 np0005532763 python3.9[112295]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Nov 23 15:48:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:32 np0005532763 python3.9[112449]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.8kuaxr4i follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:48:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:32.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:32 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:32.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:33 np0005532763 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 23 15:48:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:33 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:33 np0005532763 python3.9[112574]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.8kuaxr4i mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763930911.8866613-104-83523969867064/.source.8kuaxr4i _original_basename=.xrx6dqp5 follow=False checksum=6cd7b37efcd593debc42fa9bb68a32d60f10fcfa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:33 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:34 np0005532763 python3.9[112729]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:48:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:34.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:48:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:34 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:48:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:34.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:48:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/204835 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:48:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:35 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:35 np0005532763 python3.9[112882]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCZyfELJX7KkP8E4Yo+r9guKNy64TSJDfB+rBUAclCyKwGxjxhBTRAJJCOL6kSBIkbUub9LTNVh+s271jrKlK1rYs22c1DFe3ci9hBERauX4lIaBHw9kJBHURb9cB+VbonXf0hAdqGDLTXdqFnbed2oU0ngSuVesO/C9+SCSZFsfERuUe3/SXKbWfjehgYTi4GquXo6Ynq1HopME6mRR8qGsv6sgdkxpSaUiwtSBG5ONOSyzrev1t2hdDsRxvbZAZgV2ab6IMD9DTKaIXphHpumL6txas+nKViUfm+gW6p6EKNdHb/VLha7ghY3p4LE3OdXM4eytxszF0Fzs/0CXzafNxHjVjHzqxrJBi/PT22i6QD60NTimabHulw8IkZG6KsuNVq1rmlSSGQGjqAs7l6hNH8kF4uq1JwOl6mVgct5iE+ZzhfO5WRWShiE1LlCZpqdYE9VqmBrK5r70N0srW3h2mb4lTAwvC089Vert64D29M7riepyGCrGInpE4aK7Sk=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIFop+sR8mOkxOfCCMKg8Voa+6Ns0zHMRLKg+WdnL56v#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFQ0Rj0/OjRh0AQLkOX0VueFFf3xD5FqSzewSN/8R0Xh0Ybf7bkNUGszKaTkKSUBKR2e9V/GwA+BxEChWtzU3sY=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCrfRiqah4FSYlin2mt3PYchMDfWNjxPXqcCCW7iymA93OXZ1reX9dxsJRSssuxIkwaYv7OC+wrUmMOsDhULhy9uNDku8TnHodZVNms8z3UwQW2GPePqEdQ56rKSJ5DhpY0ly7PapOQ69jitmBGQjsu8go19hV3djXlFm1du9V1HMnfGqyr5REZ5ACjW2Rr0108gdYgrt/xh+1sl7cgixK0vUKaqN47/VJHXSTk20aXknt5lhurSKMbRD4cgP1pz0lBJ8LfEvFajLlXBk7MtsI8L94qtHH20hWUk8P2FmqsM4LoLIY4YkAT6kzDPkNdC5F3bpl67NzNXKLdStChVsjRVgrsR0JhU4YO8nYPSqn85KWQUMsuQhXfeMPb5a0n4vSmF0hQhaTctIIK5Yq+qK3S5Ee0tV+ZLMcrYiRfVJYjULh+8LazeUYBtZAVkOoenlHNpcxfVl2v8Fx37PYu6wY/1Ol7i+Fyg+DMculPNu0E00hYIfuSPW06sm98V0zJ7bs=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIC0+oolG6Djq6MTp/HXh3SEc2a8aDRu5q8AnCiNHx/fN#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 
AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBC1GCZqvti/wHDh2Oo7NSAFToY/dykBAXL2bgJmg9kqKO2qTzfIYtCRiGP/x9yaw+D3ymaftMgdHgFkzRtYcXz0=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCo3+sqhh74Wal6wWv19BRNHNnjTPYKculYCUftHSfYmbg5LryLTnsWAJdalXVBYQIJtq5uFrJRBG4C0R1XMU/MT4ZxuTtafwAzeTnKoCHbN/+mH31bndpvGKYRQ9AQHmamquyDQaSEjIYKFaK6eM7uVV/PaSZqasrB6awv3MeDH/GhtlyJwY7ble8M3UtG9jMWuPq/qX+TnKCZI3COyKBCe7F3aeaIewsho+T7qsRd8UNr55SHWJ1N6xYtA4FUayJ4cCZUeo4+SOJuQWb6A3HZm75y0LpdLDFH54DqyDqKVvDUfaKJJQV++3GT9kF9+jrwJDEK9VslSlEylLZ0zg1J0Z2zyMOwOAxBKEUXQNymC+00ybwJd4trP7KDy6+ZGOtHEThBgVO6vtuxQLWhseNa3otNXh7cHTf+Jfo7uo1wHbasd6aD1AVxvt4yKgOGy1ypt9Ps/COlbfHHFYZsI5gVLyJyK8aeipUjJUe6u6Qlf/F/inV1rwRBg8li7oeW7Ss=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFE96kcIFDgsK09K4ZL9HihPRGUmf4YDgXlXqtYy0M8r#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJoWf98fFp9mmY0S22K7n+FjL7cDYCGLm8eglORId7ZBFp9PG5e8P+ws6VWjBbceNazmskqBYurrlrsvB4Mu40E=#012 create=True mode=0644 path=/tmp/ansible.8kuaxr4i state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:35 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:36 np0005532763 python3.9[113035]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.8kuaxr4i' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:48:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:36.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:36 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:36.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:37 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:37 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:37 np0005532763 python3.9[113216]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.8kuaxr4i state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:38 np0005532763 systemd[1]: session-44.scope: Deactivated successfully.
Nov 23 15:48:38 np0005532763 systemd[1]: session-44.scope: Consumed 6.500s CPU time.
Nov 23 15:48:38 np0005532763 systemd-logind[830]: Session 44 logged out. Waiting for processes to exit.
Nov 23 15:48:38 np0005532763 systemd-logind[830]: Removed session 44.
Nov 23 15:48:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:38.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:38 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:48:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:38.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:48:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:39 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:39 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:48:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:40.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:40 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:48:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:40.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:48:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:41 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:41 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:42.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:42 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:48:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:42.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:48:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:43 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:43 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:43 np0005532763 systemd-logind[830]: New session 45 of user zuul.
Nov 23 15:48:43 np0005532763 systemd[1]: Started Session 45 of User zuul.
Nov 23 15:48:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:48:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:44.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:48:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:48:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:48:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:44.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:45 np0005532763 python3.9[113402]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:48:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:45 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:45 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:48:45.432076) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930925432471, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 2105, "num_deletes": 251, "total_data_size": 6065391, "memory_usage": 6152056, "flush_reason": "Manual Compaction"}
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930925453079, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 2473480, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10724, "largest_seqno": 12824, "table_properties": {"data_size": 2467201, "index_size": 3222, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 15622, "raw_average_key_size": 20, "raw_value_size": 2453502, "raw_average_value_size": 3178, "num_data_blocks": 143, "num_entries": 772, "num_filter_entries": 772, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930734, "oldest_key_time": 1763930734, "file_creation_time": 1763930925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 20786 microseconds, and 11092 cpu microseconds.
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:48:45.453164) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 2473480 bytes OK
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:48:45.453205) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:48:45.455479) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:48:45.455512) EVENT_LOG_v1 {"time_micros": 1763930925455502, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:48:45.455545) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 6056177, prev total WAL file size 6056177, number of live WAL files 2.
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:48:45.458648) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323532' seq:0, type:0; will stop at (end)
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(2415KB)], [21(13MB)]
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930925458736, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 16381298, "oldest_snapshot_seqno": -1}
Nov 23 15:48:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4414 keys, 14652600 bytes, temperature: kUnknown
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930925545925, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 14652600, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14618879, "index_size": 21579, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11077, "raw_key_size": 111419, "raw_average_key_size": 25, "raw_value_size": 14534130, "raw_average_value_size": 3292, "num_data_blocks": 926, "num_entries": 4414, "num_filter_entries": 4414, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763930925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:48:45.546255) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 14652600 bytes
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:48:45.548086) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 187.7 rd, 167.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 13.3 +0.0 blob) out(14.0 +0.0 blob), read-write-amplify(12.5) write-amplify(5.9) OK, records in: 4847, records dropped: 433 output_compression: NoCompression
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:48:45.548120) EVENT_LOG_v1 {"time_micros": 1763930925548103, "job": 10, "event": "compaction_finished", "compaction_time_micros": 87278, "compaction_time_cpu_micros": 50635, "output_level": 6, "num_output_files": 1, "total_output_size": 14652600, "num_input_records": 4847, "num_output_records": 4414, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 15:48:45 np0005532763 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930925549060, "job": 10, "event": "table_file_deletion", "file_number": 23}
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930925554041, "job": 10, "event": "table_file_deletion", "file_number": 21}
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:48:45.458525) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:48:45.554100) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:48:45.554107) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:48:45.554112) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:48:45.554115) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:48:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:48:45.554118) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:48:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:46 np0005532763 python3.9[113560]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 23 15:48:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:46.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:46 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:46.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:47 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:47 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:47 : epoch 6923726b : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:48:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:47 : epoch 6923726b : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:48:48 np0005532763 python3.9[113716]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:48:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:48.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:48 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:48.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:49 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:49 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:49 np0005532763 python3.9[113870]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:48:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:48:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:50 np0005532763 python3.9[114024]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:48:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:48:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:50.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:48:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:50 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:50 : epoch 6923726b : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:48:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:48:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:50.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:48:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:51 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:51 np0005532763 python3.9[114177]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:48:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:51 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:51 np0005532763 systemd[1]: session-45.scope: Deactivated successfully.
Nov 23 15:48:51 np0005532763 systemd[1]: session-45.scope: Consumed 4.889s CPU time.
Nov 23 15:48:51 np0005532763 systemd-logind[830]: Session 45 logged out. Waiting for processes to exit.
Nov 23 15:48:51 np0005532763 systemd-logind[830]: Removed session 45.
Nov 23 15:48:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:52.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:52 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:52.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:53 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:53 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:48:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:54.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:48:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:48:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:54 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:48:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:54.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:48:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:55 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:55 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:48:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:56.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:48:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:56 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:56.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:56 np0005532763 systemd-logind[830]: New session 46 of user zuul.
Nov 23 15:48:56 np0005532763 systemd[1]: Started Session 46 of User zuul.
Nov 23 15:48:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:57 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/204857 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:48:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:57 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:58 np0005532763 python3.9[114389]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:48:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:48:58.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:58 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:48:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:48:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:48:58.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:48:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:59 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:48:59 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:48:59 np0005532763 python3.9[114546]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:48:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:48:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:48:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:48:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:48:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:00 np0005532763 python3.9[114631]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 23 15:49:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:00.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:00 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:49:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:00.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:49:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:01 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:01 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002100 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:02 np0005532763 python3.9[114784]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:49:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:02.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:02 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:02.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:03 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:03 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:03 np0005532763 python3.9[114937]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 23 15:49:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:49:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:04.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:49:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:49:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:04 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024002120 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:04 np0005532763 python3.9[115169]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:49:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:04.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:04 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:49:04 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:49:04 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:49:04 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:49:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:05 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:05 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:05 np0005532763 python3.9[115319]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:49:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:06 np0005532763 systemd-logind[830]: Session 46 logged out. Waiting for processes to exit.
Nov 23 15:49:06 np0005532763 systemd[1]: session-46.scope: Deactivated successfully.
Nov 23 15:49:06 np0005532763 systemd[1]: session-46.scope: Consumed 6.873s CPU time.
Nov 23 15:49:06 np0005532763 systemd-logind[830]: Removed session 46.
Nov 23 15:49:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:06.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/204906 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:49:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:06 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:06.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:07 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802400a9a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:07 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:08.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:08 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:08.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:09 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:09 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:49:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:49:09.845780) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930949845818, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 511, "num_deletes": 251, "total_data_size": 771857, "memory_usage": 782840, "flush_reason": "Manual Compaction"}
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930949851213, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 509525, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12829, "largest_seqno": 13335, "table_properties": {"data_size": 506826, "index_size": 735, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6430, "raw_average_key_size": 18, "raw_value_size": 501364, "raw_average_value_size": 1440, "num_data_blocks": 32, "num_entries": 348, "num_filter_entries": 348, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930926, "oldest_key_time": 1763930926, "file_creation_time": 1763930949, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 5517 microseconds, and 2989 cpu microseconds.
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:49:09.851297) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 509525 bytes OK
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:49:09.851318) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:49:09.852712) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:49:09.852733) EVENT_LOG_v1 {"time_micros": 1763930949852726, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:49:09.852753) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 768833, prev total WAL file size 768833, number of live WAL files 2.
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:49:09.853777) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(497KB)], [24(13MB)]
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930949853833, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 15162125, "oldest_snapshot_seqno": -1}
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4247 keys, 13435004 bytes, temperature: kUnknown
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930949926465, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 13435004, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13404043, "index_size": 19267, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10629, "raw_key_size": 108863, "raw_average_key_size": 25, "raw_value_size": 13323814, "raw_average_value_size": 3137, "num_data_blocks": 815, "num_entries": 4247, "num_filter_entries": 4247, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763930949, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:49:09.926717) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 13435004 bytes
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:49:09.928079) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 208.5 rd, 184.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 14.0 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(56.1) write-amplify(26.4) OK, records in: 4762, records dropped: 515 output_compression: NoCompression
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:49:09.928102) EVENT_LOG_v1 {"time_micros": 1763930949928091, "job": 12, "event": "compaction_finished", "compaction_time_micros": 72706, "compaction_time_cpu_micros": 49696, "output_level": 6, "num_output_files": 1, "total_output_size": 13435004, "num_input_records": 4762, "num_output_records": 4247, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930949928350, "job": 12, "event": "table_file_deletion", "file_number": 26}
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763930949931889, "job": 12, "event": "table_file_deletion", "file_number": 24}
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:49:09.853504) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:49:09.932018) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:49:09.932023) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:49:09.932024) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:49:09.932026) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:49:09 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:49:09.932027) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:49:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:49:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:10.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:49:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:10 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:10.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:11 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:11 np0005532763 systemd-logind[830]: New session 47 of user zuul.
Nov 23 15:49:11 np0005532763 systemd[1]: Started Session 47 of User zuul.
Nov 23 15:49:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:11 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:12 np0005532763 python3.9[115529]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:49:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:12.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:12 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc004340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:12.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:13 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:13 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:14 np0005532763 python3.9[115689]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:49:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:14.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:49:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:14 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:49:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:14.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:49:14 np0005532763 python3.9[115842]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:49:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:15 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80000010b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:15 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:15 np0005532763 python3.9[115995]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:49:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:15 : epoch 6923726b : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:49:16 np0005532763 python3.9[116118]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930955.1996832-159-85863491031106/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=2940b0047ddb0630c3c0ece0e853b5ed7bcd680a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:49:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:16.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:49:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:16 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:49:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:16.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:49:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:17 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:17 np0005532763 python3.9[116271]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:49:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:17 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80000010b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:17 np0005532763 python3.9[116395]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930956.6668422-159-277419147432275/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=837e8dcdbcb3ca01e6b5360b86e6942411e1cc1f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:18 np0005532763 python3.9[116572]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:49:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:49:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:18.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:49:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:18 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:49:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:18.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:49:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:18 : epoch 6923726b : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:49:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:18 : epoch 6923726b : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:49:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:19 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:19 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:19 np0005532763 python3.9[116696]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930958.0580275-159-82035725707636/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=6474ab278ec9949fa1270d5330e74c4a7dc84e9c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:49:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:20 np0005532763 python3.9[116849]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:49:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:49:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:20.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:49:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:20 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:20.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:21 np0005532763 python3.9[117002]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:49:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:21 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:21 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:21 : epoch 6923726b : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:49:22 np0005532763 python3.9[117155]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:49:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:22.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:22 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:22.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:22 np0005532763 python3.9[117279]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930961.4836364-358-229544076161561/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=c84345861fea7d79e89911576a91e194a177572b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:23 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:23 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:23 np0005532763 python3.9[117432]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:49:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:24 np0005532763 python3.9[117555]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930963.0371773-358-23961530426571/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=26cfebde0335fa79ed2e9639d0ee86f73b64ddb4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:24.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:49:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:24 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:49:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:24.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:49:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:25 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:25 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:25 np0005532763 python3.9[117708]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:49:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:25 np0005532763 python3.9[117832]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930964.7476442-358-200771825181271/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=33c5ff2b8413a0f8b093419c5b44573f7d02af0c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:49:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:26.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:49:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:26 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:26.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:26 np0005532763 python3.9[117985]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:49:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:27 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:27 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:27 np0005532763 python3.9[118137]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:49:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:28 np0005532763 python3.9[118290]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:49:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:49:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:28.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:49:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/204928 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:49:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:28 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002ab0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:49:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:28.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:49:29 np0005532763 python3.9[118414]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930967.8535984-565-21398735700467/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=b90540da2321ded0af9e8e012b90df713825849b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:29 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:29 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:49:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:29 np0005532763 python3.9[118567]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:49:30 np0005532763 python3.9[118690]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930969.312889-565-46027871055554/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=26cfebde0335fa79ed2e9639d0ee86f73b64ddb4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:49:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:30.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:49:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:30 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:49:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:30.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:49:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002ab0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:31 np0005532763 python3.9[118843]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:49:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:31 np0005532763 python3.9[118967]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930970.692591-565-69291482525474/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=1a25c56c6f271be4f643f0d029ae67a11a5cb779 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:32.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:32 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:49:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:32.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:49:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:33 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:33 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002ab0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:33 np0005532763 python3.9[119120]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:49:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:34 np0005532763 python3.9[119273]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:49:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:49:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:34.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:34 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:34 np0005532763 python3.9[119397]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930973.5701036-790-112008996976445/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=848940549ac5db80ec615963c7c09743939a62fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:34.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:35 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:35 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:35 np0005532763 python3.9[119549]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:49:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:36 np0005532763 python3.9[119702]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:49:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:49:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:36.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:49:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:36 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002ab0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:49:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:36.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:49:37 np0005532763 python3.9[119826]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930975.8292475-872-130665989864734/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=848940549ac5db80ec615963c7c09743939a62fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:37 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:37 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:37 np0005532763 python3.9[119979]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:49:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:49:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:38.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:49:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:38 np0005532763 python3.9[120156]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:49:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:38 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:38.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:39 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:39 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:39 np0005532763 python3.9[120280]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930978.1339893-950-135044826916447/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=848940549ac5db80ec615963c7c09743939a62fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:49:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:40 np0005532763 python3.9[120433]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:49:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:49:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:40.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:49:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:40 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:40.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:41 np0005532763 python3.9[120586]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:49:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:41 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff000bb60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:41 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002af0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:41 np0005532763 python3.9[120710]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930980.529239-1031-142289795760338/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=848940549ac5db80ec615963c7c09743939a62fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:49:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:42.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:49:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:42 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:42 np0005532763 python3.9[120863]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:49:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:42.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:43 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:43 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024008e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:43 np0005532763 python3.9[121016]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:49:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:44 np0005532763 python3.9[121140]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930983.0633836-1079-149955919670139/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=848940549ac5db80ec615963c7c09743939a62fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:49:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:49:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:44.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:49:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:44.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:45 np0005532763 python3.9[121294]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:49:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:45 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:45 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:46 np0005532763 python3.9[121447]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:49:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:49:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:46.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:49:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:46 np0005532763 python3.9[121570]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763930985.4441242-1103-33085888463173/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=848940549ac5db80ec615963c7c09743939a62fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:46 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024008e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:46.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:47 np0005532763 systemd[1]: session-47.scope: Deactivated successfully.
Nov 23 15:49:47 np0005532763 systemd[1]: session-47.scope: Consumed 29.140s CPU time.
Nov 23 15:49:47 np0005532763 systemd-logind[830]: Session 47 logged out. Waiting for processes to exit.
Nov 23 15:49:47 np0005532763 systemd-logind[830]: Removed session 47.
Nov 23 15:49:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:47 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:47 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:48.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:48 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:48.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:49 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:49 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:49:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:49:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:50.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:49:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/204950 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:49:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:50 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:50.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:51 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:51 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:52.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:52 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:52.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:52 np0005532763 systemd-logind[830]: New session 48 of user zuul.
Nov 23 15:49:52 np0005532763 systemd[1]: Started Session 48 of User zuul.
Nov 23 15:49:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:53 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:53 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:53 np0005532763 python3.9[121758]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:49:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:49:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:54.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:49:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:54 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024008e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:54 np0005532763 python3.9[121911]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:49:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:54.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:55 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024008e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:55 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:55 np0005532763 python3.9[122034]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763930994.118431-64-246744913510453/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=756e8313f47ae598921d0392828cdc60f53012e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:56 np0005532763 python3.9[122187]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:49:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:49:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:56.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:49:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:56 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:56.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:56 np0005532763 python3.9[122311]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763930995.735937-64-208918241466329/.source.conf _original_basename=ceph.conf follow=False checksum=d92b20e9a86369ec384ba170ca716bfc5aeaba51 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:49:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:57 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024008e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:57 np0005532763 systemd[1]: session-48.scope: Deactivated successfully.
Nov 23 15:49:57 np0005532763 systemd[1]: session-48.scope: Consumed 3.318s CPU time.
Nov 23 15:49:57 np0005532763 systemd-logind[830]: Session 48 logged out. Waiting for processes to exit.
Nov 23 15:49:57 np0005532763 systemd-logind[830]: Removed session 48.
Nov 23 15:49:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:57 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:49:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:49:58.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:49:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:58 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:49:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:49:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:49:58.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:49:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:59 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:59 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024008e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:49:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:49:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:49:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:49:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:49:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:49:59 : epoch 6923726b : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:50:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:00.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:00 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:00 np0005532763 ceph-mon[75752]: overall HEALTH_OK
Nov 23 15:50:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:00.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:01 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:01 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:02 np0005532763 systemd-logind[830]: New session 49 of user zuul.
Nov 23 15:50:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:02.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:02 np0005532763 systemd[1]: Started Session 49 of User zuul.
Nov 23 15:50:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:02 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024008e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000057s ======
Nov 23 15:50:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:02.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Nov 23 15:50:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:02 : epoch 6923726b : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:50:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:02 : epoch 6923726b : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:50:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:03 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:03 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:03 np0005532763 python3.9[122520]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:50:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:50:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:50:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:04.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:50:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:04 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024008e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:04.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:05 np0005532763 python3.9[122678]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:50:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:05 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:05 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020002c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:05 np0005532763 python3.9[122830]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:50:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:05 : epoch 6923726b : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:50:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:06 np0005532763 python3.9[122981]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:50:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:06.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:06 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:06.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:07 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024008e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:07 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024008e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:07 np0005532763 python3.9[123134]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 23 15:50:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:08.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:08 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024008e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:08.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:09 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:09 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:09 np0005532763 dbus-broker-launch[812]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 23 15:50:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:50:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:10 np0005532763 python3.9[123374]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:50:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:10.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:50:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:50:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:50:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:50:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:50:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:50:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:50:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:50:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:10 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000004a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:50:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:10.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:50:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:11 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024008e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:11 np0005532763 python3.9[123459]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:50:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:11 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:50:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:12.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:50:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205012 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:50:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:12 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:12.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:13 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000004a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:13 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8024008e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:13 np0005532763 python3.9[123614]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 15:50:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:50:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:50:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:14.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:50:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:14 np0005532763 python3[123771]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 23 15:50:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:14 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:14.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:15 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:15 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff0002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:15 np0005532763 python3.9[123924]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:50:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:16 np0005532763 python3.9[124077]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:50:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:16.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:16 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000004a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:16.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:17 np0005532763 python3.9[124182]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:50:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:17 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:17 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:50:17 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:50:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:17 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:18 np0005532763 python3.9[124335]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:50:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:18 np0005532763 python3.9[124438]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.fvr459gu recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:50:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:18.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:18 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:18.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:19 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000004a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:19 np0005532763 python3.9[124591]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:50:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:19 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:50:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:19 np0005532763 python3.9[124670]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:50:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:20.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:20 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:20 np0005532763 python3.9[124823]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:50:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:20.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:21 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:21 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000004a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:21 np0005532763 python3[124976]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 23 15:50:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:22 np0005532763 python3.9[125129]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:50:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:50:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:22.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:50:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:22 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:22.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:23 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:23 np0005532763 python3.9[125255]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931021.9830425-433-97616603721604/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:50:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:23 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:24 np0005532763 python3.9[125408]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:50:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:50:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:50:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:24.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:50:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:24 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000004a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:24.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:25 np0005532763 python3.9[125534]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931023.6061409-478-173276793586457/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:50:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:25 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:25 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:25 np0005532763 python3.9[125687]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:50:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:26 np0005532763 python3.9[125812]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931025.2619278-524-209537031036413/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:50:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:26.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:26 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:26.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:27 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000004a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:27 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000004a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:27 np0005532763 python3.9[125965]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:50:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:28 np0005532763 python3.9[126091]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931026.8342948-569-178195078014748/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:50:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:28.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:28 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:28.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:29 np0005532763 python3.9[126244]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:50:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:29 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:29 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000004a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:50:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:29 np0005532763 python3.9[126369]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931028.3635035-614-250970145928424/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:50:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:50:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:30.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:50:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:30 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:30 np0005532763 python3.9[126523]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:50:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:30.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:31 np0005532763 python3.9[126675]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:50:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:32 np0005532763 python3.9[126831]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:50:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:32.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:32 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000004a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:32.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:33 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:33 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:33 np0005532763 python3.9[126984]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:50:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:34 np0005532763 python3.9[127140]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:50:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:50:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:34.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:34 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:34.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:35 np0005532763 python3.9[127295]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:50:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:35 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018001c40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:35 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:36 np0005532763 python3.9[127451]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:50:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:36.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:36 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:50:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:36.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:50:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:37 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:37 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018001c40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:37 np0005532763 python3.9[127602]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:50:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:38.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:38 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:38.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:38 np0005532763 python3.9[127782]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:93:45:69:49" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:50:38 np0005532763 ovs-vsctl[127783]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:93:45:69:49 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 23 15:50:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:39 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:39 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:50:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:40 np0005532763 python3.9[127936]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:50:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:40.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:40 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018001c40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:50:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:40.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:50:41 np0005532763 python3.9[128092]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:50:41 np0005532763 ovs-vsctl[128093]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 23 15:50:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:41 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:41 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:41 np0005532763 python3.9[128244]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:50:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:42.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:42 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:42.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:42 np0005532763 python3.9[128399]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:50:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:43 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018001c40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:43 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:43 np0005532763 python3.9[128552]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:50:44 np0005532763 python3.9[128630]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:50:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:50:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:44.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:44 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:44.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:45 np0005532763 python3.9[128783]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:50:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:45 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:45 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:45 np0005532763 python3.9[128861]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:50:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:46 np0005532763 python3.9[129014]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:50:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:46.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:46 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:46.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:47 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:47 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:47 np0005532763 python3.9[129167]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:50:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:48 np0005532763 python3.9[129246]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:50:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:48.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:48 np0005532763 python3.9[129399]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:50:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:48 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:50:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:48.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:50:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:49 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:49 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:49 np0005532763 python3.9[129478]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:50:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:50:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:50 np0005532763 python3.9[129631]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:50:50 np0005532763 systemd[1]: Reloading.
Nov 23 15:50:50 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:50:50 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:50:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:50.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:50 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:50.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:51 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:51 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:51 np0005532763 python3.9[129821]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:50:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:52 np0005532763 python3.9[129900]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:50:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:52.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:52 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:52.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:53 np0005532763 python3.9[130053]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:50:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:53 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4001de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:53 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:53 np0005532763 python3.9[130131]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:50:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:54 np0005532763 python3.9[130284]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:50:54 np0005532763 systemd[1]: Reloading.
Nov 23 15:50:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:50:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:54 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:50:54 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:50:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:54.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:54 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:54.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:54 np0005532763 systemd[1]: Starting Create netns directory...
Nov 23 15:50:55 np0005532763 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 15:50:55 np0005532763 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 15:50:55 np0005532763 systemd[1]: Finished Create netns directory.
Nov 23 15:50:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:55 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:55 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4004900 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:56 np0005532763 python3.9[130478]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:50:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:56.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:56 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:56 np0005532763 python3.9[130631]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:50:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:50:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:56.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:50:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:57 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:57 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:57 np0005532763 python3.9[130754]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763931056.3119898-1367-55394322067661/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:50:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:58 np0005532763 python3.9[130907]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:50:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:50:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:50:58.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:50:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:58 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4004900 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:50:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:50:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:50:58.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:50:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:59 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018004920 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:50:59 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:50:59 np0005532763 python3.9[131085]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:50:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:50:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:50:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:50:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:50:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:00 np0005532763 python3.9[131209]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763931058.8928363-1441-220211178353414/.source.json _original_basename=.1jdstjwd follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:51:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:51:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:00.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:51:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:00 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:00.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:01 np0005532763 python3.9[131362]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:51:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:01 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4004900 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:01 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018004920 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:02.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:02 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000057s ======
Nov 23 15:51:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:02.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Nov 23 15:51:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:03 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020004fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:03 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4004900 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:03 np0005532763 python3.9[131791]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 23 15:51:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:04 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 15:51:04 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2489 writes, 14K keys, 2489 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s#012Cumulative WAL: 2489 writes, 2489 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2489 writes, 14K keys, 2489 commit groups, 1.0 writes per commit group, ingest: 38.84 MB, 0.06 MB/s#012Interval WAL: 2489 writes, 2489 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    153.9      0.14              0.08         6    0.023       0      0       0.0       0.0#012  L6      1/0   12.81 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   2.9    194.8    171.3      0.37              0.23         5    0.074     21K   2263       0.0       0.0#012 Sum      1/0   12.81 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.9    141.4    166.6      0.51              0.31        11    0.046     21K   2263       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.9    142.1    167.4      0.51              0.31        10    0.051     21K   2263       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   0.0    194.8    171.3      0.37              0.23         5    0.074     21K   2263       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    156.6      0.14              0.08         5    0.027       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.021, interval 0.021#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.5 seconds#012Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e7d0d09350#2 capacity: 304.00 MB usage: 2.26 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 7.1e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(156,2.05 MB,0.675934%) FilterBlock(11,69.42 KB,0.0223009%) IndexBlock(11,138.52 KB,0.0444964%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Nov 23 15:51:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:51:04 np0005532763 python3.9[131944]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 15:51:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:51:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:04.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:51:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:04 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018004920 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:04.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:05 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:05 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:05 np0005532763 python3.9[132099]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 23 15:51:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:51:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:06.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:51:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:06 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4004900 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:51:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:06.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:51:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:07 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:07 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:07 np0005532763 python3[132280]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 15:51:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:08.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:08 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:08.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:09 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4004920 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:09 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4004920 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:51:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:10.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:10 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:10.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:11 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:11 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4004920 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:12.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:12 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:12 np0005532763 podman[132294]: 2025-11-23 20:51:12.892653897 +0000 UTC m=+4.824322436 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e
Nov 23 15:51:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:12.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:13 np0005532763 podman[132418]: 2025-11-23 20:51:13.13974802 +0000 UTC m=+0.075955833 container create 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 23 15:51:13 np0005532763 podman[132418]: 2025-11-23 20:51:13.104113636 +0000 UTC m=+0.040321499 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e
Nov 23 15:51:13 np0005532763 python3[132280]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e
Nov 23 15:51:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:13 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:13 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:51:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:14.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:14 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4004920 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:51:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:14.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:51:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:15 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:15 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:51:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:16.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:51:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:16 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:16.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:17 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4004920 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:17 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:18.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:18 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:18.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:19 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:19 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4004920 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:51:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:20 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:51:20 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:51:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:20.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:20 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:51:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:20.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:51:21 np0005532763 python3.9[132795]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:51:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:21 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:21 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:21 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:51:22 np0005532763 python3.9[132950]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:51:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:22 np0005532763 python3.9[133027]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:51:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:51:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:22.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:51:22 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:51:22 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:51:22 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:51:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:22 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:51:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:22.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:51:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:23 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:23 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:23 np0005532763 python3.9[133178]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763931082.8449793-1705-94550601508493/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:51:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:24 np0005532763 python3.9[133255]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 15:51:24 np0005532763 systemd[1]: Reloading.
Nov 23 15:51:24 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:51:24 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:51:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:51:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:24.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:24 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:24.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:25 np0005532763 python3.9[133368]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:51:25 np0005532763 systemd[1]: Reloading.
Nov 23 15:51:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:25 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:25 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4004920 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:25 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:51:25 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:51:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:25 np0005532763 systemd[1]: Starting ovn_controller container...
Nov 23 15:51:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:25 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:51:25 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32ea9c192280167115bbaab410274557c4bddccc5f44cd6ca59d764b6304097c/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 23 15:51:25 np0005532763 systemd[1]: Started /usr/bin/podman healthcheck run 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745.
Nov 23 15:51:25 np0005532763 podman[133410]: 2025-11-23 20:51:25.889182684 +0000 UTC m=+0.179486541 container init 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 23 15:51:25 np0005532763 ovn_controller[133425]: + sudo -E kolla_set_configs
Nov 23 15:51:25 np0005532763 podman[133410]: 2025-11-23 20:51:25.920735162 +0000 UTC m=+0.211038949 container start 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 23 15:51:25 np0005532763 edpm-start-podman-container[133410]: ovn_controller
Nov 23 15:51:25 np0005532763 systemd[1]: Created slice User Slice of UID 0.
Nov 23 15:51:25 np0005532763 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 23 15:51:26 np0005532763 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 23 15:51:26 np0005532763 edpm-start-podman-container[133409]: Creating additional drop-in dependency for "ovn_controller" (75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745)
Nov 23 15:51:26 np0005532763 systemd[1]: Starting User Manager for UID 0...
Nov 23 15:51:26 np0005532763 podman[133432]: 2025-11-23 20:51:26.030537237 +0000 UTC m=+0.093669487 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 15:51:26 np0005532763 systemd[1]: 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745-6d5b0a71f772f140.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 15:51:26 np0005532763 systemd[1]: 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745-6d5b0a71f772f140.service: Failed with result 'exit-code'.
Nov 23 15:51:26 np0005532763 systemd[1]: Reloading.
Nov 23 15:51:26 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:51:26 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:51:26 np0005532763 systemd[133467]: Queued start job for default target Main User Target.
Nov 23 15:51:26 np0005532763 systemd[133467]: Created slice User Application Slice.
Nov 23 15:51:26 np0005532763 systemd[133467]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 23 15:51:26 np0005532763 systemd[133467]: Started Daily Cleanup of User's Temporary Directories.
Nov 23 15:51:26 np0005532763 systemd[133467]: Reached target Paths.
Nov 23 15:51:26 np0005532763 systemd[133467]: Reached target Timers.
Nov 23 15:51:26 np0005532763 systemd[133467]: Starting D-Bus User Message Bus Socket...
Nov 23 15:51:26 np0005532763 systemd[133467]: Starting Create User's Volatile Files and Directories...
Nov 23 15:51:26 np0005532763 systemd[133467]: Listening on D-Bus User Message Bus Socket.
Nov 23 15:51:26 np0005532763 systemd[133467]: Reached target Sockets.
Nov 23 15:51:26 np0005532763 systemd[133467]: Finished Create User's Volatile Files and Directories.
Nov 23 15:51:26 np0005532763 systemd[133467]: Reached target Basic System.
Nov 23 15:51:26 np0005532763 systemd[133467]: Reached target Main User Target.
Nov 23 15:51:26 np0005532763 systemd[133467]: Startup finished in 196ms.
Nov 23 15:51:26 np0005532763 systemd[1]: Started User Manager for UID 0.
Nov 23 15:51:26 np0005532763 systemd[1]: Started ovn_controller container.
Nov 23 15:51:26 np0005532763 systemd[1]: Started Session c1 of User root.
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: INFO:__main__:Validating config file
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: INFO:__main__:Writing out command to execute
Nov 23 15:51:26 np0005532763 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: ++ cat /run_command
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: + ARGS=
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: + sudo kolla_copy_cacerts
Nov 23 15:51:26 np0005532763 systemd[1]: Started Session c2 of User root.
Nov 23 15:51:26 np0005532763 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: + [[ ! -n '' ]]
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: + . kolla_extend_start
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: + umask 0022
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 23 15:51:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 23 15:51:26 np0005532763 NetworkManager[48849]: <info>  [1763931086.5808] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Nov 23 15:51:26 np0005532763 NetworkManager[48849]: <info>  [1763931086.5820] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 23 15:51:26 np0005532763 NetworkManager[48849]: <info>  [1763931086.5838] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 23 15:51:26 np0005532763 NetworkManager[48849]: <info>  [1763931086.5848] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Nov 23 15:51:26 np0005532763 NetworkManager[48849]: <info>  [1763931086.5855] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 23 15:51:26 np0005532763 kernel: br-int: entered promiscuous mode
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00014|main|INFO|OVS feature set changed, force recompute.
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00017|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00018|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00019|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00021|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00022|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00023|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00024|main|INFO|OVS feature set changed, force recompute.
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 23 15:51:26 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:26Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 23 15:51:26 np0005532763 NetworkManager[48849]: <info>  [1763931086.6101] manager: (ovn-6de892-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Nov 23 15:51:26 np0005532763 kernel: genev_sys_6081: entered promiscuous mode
Nov 23 15:51:26 np0005532763 NetworkManager[48849]: <info>  [1763931086.6379] device (genev_sys_6081): carrier: link connected
Nov 23 15:51:26 np0005532763 NetworkManager[48849]: <info>  [1763931086.6385] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Nov 23 15:51:26 np0005532763 systemd-udevd[133561]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 15:51:26 np0005532763 systemd-udevd[133565]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 15:51:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:26.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:26 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:26.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:27 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:27 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:27 np0005532763 NetworkManager[48849]: <info>  [1763931087.8390] manager: (ovn-d8ff4a-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Nov 23 15:51:28 np0005532763 NetworkManager[48849]: <info>  [1763931088.5476] manager: (ovn-fa015a-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Nov 23 15:51:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:28.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:28 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4004920 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:51:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:29.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:51:29 np0005532763 python3.9[133698]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:51:29 np0005532763 ovs-vsctl[133699]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 23 15:51:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:29 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:29 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:51:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:30 np0005532763 python3.9[133852]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:51:30 np0005532763 ovs-vsctl[133854]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 23 15:51:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:51:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:30.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:51:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:30 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:51:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:31.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:51:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4004920 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:31 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:31 np0005532763 python3.9[134008]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:51:31 np0005532763 ovs-vsctl[134009]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 23 15:51:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:31 np0005532763 systemd[1]: session-49.scope: Deactivated successfully.
Nov 23 15:51:31 np0005532763 systemd[1]: session-49.scope: Consumed 1min 9.285s CPU time.
Nov 23 15:51:31 np0005532763 systemd-logind[830]: Session 49 logged out. Waiting for processes to exit.
Nov 23 15:51:31 np0005532763 systemd-logind[830]: Removed session 49.
Nov 23 15:51:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:32.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:32 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:33.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:33 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ffc001f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:33 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4004920 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:51:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:34.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:34 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:51:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:35.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:51:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:35 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7fe8004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:35 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8020003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:36 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:36Z|00025|memory|INFO|16384 kB peak resident set size after 10.0 seconds
Nov 23 15:51:36 np0005532763 ovn_controller[133425]: 2025-11-23T20:51:36Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Nov 23 15:51:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:36 np0005532763 systemd[1]: Stopping User Manager for UID 0...
Nov 23 15:51:36 np0005532763 systemd[133467]: Activating special unit Exit the Session...
Nov 23 15:51:36 np0005532763 systemd[133467]: Stopped target Main User Target.
Nov 23 15:51:36 np0005532763 systemd[133467]: Stopped target Basic System.
Nov 23 15:51:36 np0005532763 systemd[133467]: Stopped target Paths.
Nov 23 15:51:36 np0005532763 systemd[133467]: Stopped target Sockets.
Nov 23 15:51:36 np0005532763 systemd[133467]: Stopped target Timers.
Nov 23 15:51:36 np0005532763 systemd[133467]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 23 15:51:36 np0005532763 systemd[133467]: Closed D-Bus User Message Bus Socket.
Nov 23 15:51:36 np0005532763 systemd[133467]: Stopped Create User's Volatile Files and Directories.
Nov 23 15:51:36 np0005532763 systemd[133467]: Removed slice User Application Slice.
Nov 23 15:51:36 np0005532763 systemd[133467]: Reached target Shutdown.
Nov 23 15:51:36 np0005532763 systemd[133467]: Finished Exit the Session.
Nov 23 15:51:36 np0005532763 systemd[133467]: Reached target Exit the Session.
Nov 23 15:51:36 np0005532763 systemd[1]: user@0.service: Deactivated successfully.
Nov 23 15:51:36 np0005532763 systemd[1]: Stopped User Manager for UID 0.
Nov 23 15:51:36 np0005532763 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 23 15:51:36 np0005532763 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 23 15:51:36 np0005532763 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 23 15:51:36 np0005532763 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 23 15:51:36 np0005532763 systemd[1]: Removed slice User Slice of UID 0.
Nov 23 15:51:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:36.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:36 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4004920 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:51:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:37.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:51:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:37 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:37 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002560 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:38 np0005532763 systemd-logind[830]: New session 51 of user zuul.
Nov 23 15:51:38 np0005532763 systemd[1]: Started Session 51 of User zuul.
Nov 23 15:51:38 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:51:38 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:51:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:51:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:38.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:51:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:38 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002560 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:51:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:39.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:51:39 np0005532763 python3.9[134249]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:51:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:39 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7ff4004ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:39 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 15:51:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 5585 writes, 24K keys, 5585 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 5585 writes, 902 syncs, 6.19 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5585 writes, 24K keys, 5585 commit groups, 1.0 writes per commit group, ingest: 19.00 MB, 0.03 MB/s#012Interval WAL: 5585 writes, 902 syncs, 6.19 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557d78bc9350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557d78bc9350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowd
Nov 23 15:51:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:51:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:40 np0005532763 python3.9[134407]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:51:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:40.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:40 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:51:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:51:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:41.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:51:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[93861]: 23/11/2025 20:51:41 : epoch 6923726b : compute-2 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8000003040 fd 39 proxy ignored for local
Nov 23 15:51:41 np0005532763 kernel: ganesha.nfsd[131970]: segfault at 50 ip 00007f80d113532e sp 00007f8092ffc210 error 4 in libntirpc.so.5.8[7f80d111a000+2c000] likely on CPU 5 (core 0, socket 5)
Nov 23 15:51:41 np0005532763 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Nov 23 15:51:41 np0005532763 systemd[1]: Started Process Core Dump (PID 134560/UID 0).
Nov 23 15:51:41 np0005532763 python3.9[134559]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:51:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:42 np0005532763 python3.9[134714]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:51:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:42 np0005532763 systemd-coredump[134561]: Process 93865 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 76:#012#0  0x00007f80d113532e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Nov 23 15:51:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:42.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:42 np0005532763 systemd[1]: systemd-coredump@3-134560-0.service: Deactivated successfully.
Nov 23 15:51:42 np0005532763 systemd[1]: systemd-coredump@3-134560-0.service: Consumed 1.276s CPU time.
Nov 23 15:51:42 np0005532763 podman[134843]: 2025-11-23 20:51:42.893369979 +0000 UTC m=+0.042193223 container died e28fbc6ba0fdd4406bf38781cfbbcee6b5f6bf3434626964fa14eb222d1569b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Nov 23 15:51:42 np0005532763 systemd[1]: var-lib-containers-storage-overlay-299620c935f64a202c2a323e336566fb64411149883365fed255920042bb21e5-merged.mount: Deactivated successfully.
Nov 23 15:51:42 np0005532763 podman[134843]: 2025-11-23 20:51:42.947696445 +0000 UTC m=+0.096519669 container remove e28fbc6ba0fdd4406bf38781cfbbcee6b5f6bf3434626964fa14eb222d1569b3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 15:51:42 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Main process exited, code=exited, status=139/n/a
Nov 23 15:51:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:43.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:43 np0005532763 python3.9[134885]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:51:43 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Failed with result 'exit-code'.
Nov 23 15:51:43 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 2.356s CPU time.
Nov 23 15:51:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:43 np0005532763 python3.9[135066]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:51:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:51:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:44.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:44 np0005532763 python3.9[135218]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:51:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:51:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:45.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:51:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:46 np0005532763 python3.9[135375]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 23 15:51:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:46.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:51:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:47.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:51:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205147 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:51:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:47 np0005532763 python3.9[135527]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:51:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:48 np0005532763 python3.9[135648]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763931107.2902792-221-2923452938635/.source follow=False _original_basename=haproxy.j2 checksum=deae64da24ad28f71dc47276f2e9f268f19a4519 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:51:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:51:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:48.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:51:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:51:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:49.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:51:49 np0005532763 python3.9[135799]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:51:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:51:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:50 np0005532763 python3.9[135921]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763931108.8490512-265-181988844363509/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:51:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:50.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:51:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:51.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:51:51 np0005532763 python3.9[136074]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:51:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:52 np0005532763 python3.9[136159]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:51:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:52.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:53.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:53 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Scheduled restart job, restart counter is at 4.
Nov 23 15:51:53 np0005532763 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:51:53 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 2.356s CPU time.
Nov 23 15:51:53 np0005532763 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:51:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:53 np0005532763 podman[136268]: 2025-11-23 20:51:53.679178498 +0000 UTC m=+0.050745446 container create d458b31cba4757362810be60b3fae609e70bcece8679b24575ca18c618ca9ffd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:51:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:53 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/04fef7d934e3f34a3b25ae75ae573b75eb79f8aff9300f9c152e4b1ce08e5a1d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Nov 23 15:51:53 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/04fef7d934e3f34a3b25ae75ae573b75eb79f8aff9300f9c152e4b1ce08e5a1d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:51:53 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/04fef7d934e3f34a3b25ae75ae573b75eb79f8aff9300f9c152e4b1ce08e5a1d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:51:53 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/04fef7d934e3f34a3b25ae75ae573b75eb79f8aff9300f9c152e4b1ce08e5a1d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.dqbktw-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:51:53 np0005532763 podman[136268]: 2025-11-23 20:51:53.655653018 +0000 UTC m=+0.027219986 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:51:53 np0005532763 podman[136268]: 2025-11-23 20:51:53.756677444 +0000 UTC m=+0.128244482 container init d458b31cba4757362810be60b3fae609e70bcece8679b24575ca18c618ca9ffd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 15:51:53 np0005532763 podman[136268]: 2025-11-23 20:51:53.763376784 +0000 UTC m=+0.134943762 container start d458b31cba4757362810be60b3fae609e70bcece8679b24575ca18c618ca9ffd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 23 15:51:53 np0005532763 bash[136268]: d458b31cba4757362810be60b3fae609e70bcece8679b24575ca18c618ca9ffd
Nov 23 15:51:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:51:53 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Nov 23 15:51:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:51:53 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Nov 23 15:51:53 np0005532763 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:51:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:51:53 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Nov 23 15:51:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:51:53 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Nov 23 15:51:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:51:53 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Nov 23 15:51:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:51:53 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Nov 23 15:51:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:51:53 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Nov 23 15:51:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:51:53 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:51:54 np0005532763 python3.9[136418]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 15:51:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:51:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:54.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:51:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:55.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:51:55 np0005532763 python3.9[136572]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:51:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:56 np0005532763 python3.9[136694]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763931114.8279538-376-52905574607931/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:51:56 np0005532763 podman[136695]: 2025-11-23 20:51:56.224884092 +0000 UTC m=+0.108987883 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 15:51:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:56.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:56 np0005532763 python3.9[136871]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:51:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:51:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:57.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:51:57 np0005532763 python3.9[136992]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763931116.28234-376-125293725071106/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:51:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:58 np0005532763 python3.9[137144]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:51:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:51:58.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:51:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:51:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:51:59.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:51:59 np0005532763 python3.9[137290]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763931118.2906508-509-145551333589408/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:51:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:51:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:51:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:51:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:51:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:51:59 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:51:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:51:59 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:52:00 np0005532763 python3.9[137441]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:52:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:00.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:00 np0005532763 python3.9[137563]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763931119.7123692-509-32498842318602/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:52:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:01.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:01 np0005532763 python3.9[137713]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:52:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:02 np0005532763 python3.9[137868]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:52:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:02.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:03.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:03 np0005532763 python3.9[138021]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:52:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:04 np0005532763 python3.9[138100]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:52:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:52:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:04 np0005532763 python3.9[138253]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:52:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:04.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:52:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:05.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:52:05 np0005532763 python3.9[138331]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Nov 23 15:52:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:05 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:52:06 np0005532763 python3.9[138496]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:52:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:06.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:06 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc238000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:52:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:07.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:52:07 np0005532763 python3.9[138649]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:52:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:07 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220000da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:07 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc214000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:07 np0005532763 python3.9[138730]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:52:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:08 np0005532763 python3.9[138883]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:52:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:08.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:08 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc234001cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:09.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:09 np0005532763 python3.9[138962]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:52:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205209 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:52:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:09 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc228001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:09 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:52:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:10 np0005532763 python3.9[139115]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:52:10 np0005532763 systemd[1]: Reloading.
Nov 23 15:52:10 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:52:10 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:52:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:52:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:10.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:52:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:10 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:11.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:11 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2340027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:11 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc228001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:11 np0005532763 python3.9[139305]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:52:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:12 np0005532763 python3.9[139384]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:52:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:12.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:12 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:12 np0005532763 python3.9[139537]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:52:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:13.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:13 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220001ab0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:13 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2340027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:13 np0005532763 python3.9[139615]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:52:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:14 np0005532763 python3.9[139768]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:52:14 np0005532763 systemd[1]: Reloading.
Nov 23 15:52:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:14 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:52:14 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:52:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:52:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:14.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:14 np0005532763 systemd[1]: Starting Create netns directory...
Nov 23 15:52:14 np0005532763 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 15:52:14 np0005532763 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 15:52:14 np0005532763 systemd[1]: Finished Create netns directory.
Nov 23 15:52:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:14 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc228001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:15.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:15 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc214001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:15 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220002bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:15 np0005532763 python3.9[139964]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:52:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:16 np0005532763 python3.9[140116]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:52:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:16.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:16 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2340027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:17.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:17 np0005532763 python3.9[140240]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763931136.1359296-961-216375606615045/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:52:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:17 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc228001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:17 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc214001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:18 np0005532763 python3.9[140393]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:52:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:18.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:18 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220002bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:19.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:19 np0005532763 python3.9[140571]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:52:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:19 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2340027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:19 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2280031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:52:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:19 np0005532763 python3.9[140695]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763931138.707424-1036-130280007042670/.source.json _original_basename=.liqtre56 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:52:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:20 np0005532763 python3.9[140848]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:52:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:20.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:20 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc214001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:52:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:21.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:52:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:21 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220002bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:21 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2340027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:22.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:22 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2280031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:23.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:23 np0005532763 python3.9[141277]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 23 15:52:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:23 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2140032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:23 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220002bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:24 np0005532763 python3.9[141430]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 15:52:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:52:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:24.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:24 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2340027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:25.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:25 np0005532763 python3.9[141583]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 23 15:52:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:25 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2280031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:25 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc210000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:26.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:26 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220002bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:27.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:27 np0005532763 podman[141735]: 2025-11-23 20:52:27.200136747 +0000 UTC m=+0.132484324 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 15:52:27 np0005532763 python3[141781]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 15:52:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:27 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2340027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:27 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2280042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:28.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:28 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2100016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:29.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:29 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220002bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:29 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220002bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:52:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:30.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:30 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220002bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:31.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:31 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2100016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:31 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2340027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:32.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:32 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220002bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:33.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:33 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220002bb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:33 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2100016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:52:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:34.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:34 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2340027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:35.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:35 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2280042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:35 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2280042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:35 np0005532763 podman[141801]: 2025-11-23 20:52:35.661421928 +0000 UTC m=+8.155601538 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 23 15:52:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:35 np0005532763 podman[141936]: 2025-11-23 20:52:35.86234553 +0000 UTC m=+0.065851347 container create 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 23 15:52:35 np0005532763 podman[141936]: 2025-11-23 20:52:35.833721043 +0000 UTC m=+0.037226870 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 23 15:52:35 np0005532763 python3[141781]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 23 15:52:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:36.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:36 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc210002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:37.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:37 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2340027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:37 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2280042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:52:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:38.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:52:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:38 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2280042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:39.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:39 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc210002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:39 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc208000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:52:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:40.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:40 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc204000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:41.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:41 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220003cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205241 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:52:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:41 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc210002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:42.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:42 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2080016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:43.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:43 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2040016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:43 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc210002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:44 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:52:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:52:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Nov 23 15:52:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Nov 23 15:52:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Nov 23 15:52:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Nov 23 15:52:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:44.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Nov 23 15:52:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Nov 23 15:52:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Nov 23 15:52:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:44 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220003cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Nov 23 15:52:45 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:52:45 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:52:45 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:52:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:52:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:45.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:52:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:45 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2080016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:45 np0005532763 python3.9[142261]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:52:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:45 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2080016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:46 np0005532763 python3.9[142416]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:52:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:46.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:46 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc210003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:47 np0005532763 python3.9[142493]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:52:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:47.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:47 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220003cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:47 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2080016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:47 np0005532763 python3.9[142645]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763931167.1401641-1300-244475400807968/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:52:48 np0005532763 python3.9[142721]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 15:52:48 np0005532763 systemd[1]: Reloading.
Nov 23 15:52:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:48 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:52:48 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:52:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:52:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:48.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:52:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:48 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc204001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:49 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:52:49 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:52:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:49.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:49 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc204001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:49 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220003cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:49 np0005532763 python3.9[142858]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:52:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:52:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:49 np0005532763 systemd[1]: Reloading.
Nov 23 15:52:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:49 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:52:49 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:52:50 np0005532763 systemd[1]: Starting ovn_metadata_agent container...
Nov 23 15:52:50 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:52:50 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43363ab5001dca1ad6550130549c9bbe2388a81851c2deca2f4eff9691e4d14d/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 23 15:52:50 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43363ab5001dca1ad6550130549c9bbe2388a81851c2deca2f4eff9691e4d14d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 15:52:50 np0005532763 systemd[1]: Started /usr/bin/podman healthcheck run 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45.
Nov 23 15:52:50 np0005532763 podman[142900]: 2025-11-23 20:52:50.312849809 +0000 UTC m=+0.239225992 container init 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: + sudo -E kolla_set_configs
Nov 23 15:52:50 np0005532763 podman[142900]: 2025-11-23 20:52:50.350549592 +0000 UTC m=+0.276925685 container start 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 23 15:52:50 np0005532763 edpm-start-podman-container[142900]: ovn_metadata_agent
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: INFO:__main__:Validating config file
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: INFO:__main__:Copying service configuration files
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: INFO:__main__:Writing out command to execute
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: ++ cat /run_command
Nov 23 15:52:50 np0005532763 edpm-start-podman-container[142899]: Creating additional drop-in dependency for "ovn_metadata_agent" (096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45)
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: + CMD=neutron-ovn-metadata-agent
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: + ARGS=
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: + sudo kolla_copy_cacerts
Nov 23 15:52:50 np0005532763 systemd[1]: Reloading.
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: + [[ ! -n '' ]]
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: + . kolla_extend_start
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: Running command: 'neutron-ovn-metadata-agent'
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: + umask 0022
Nov 23 15:52:50 np0005532763 ovn_metadata_agent[142915]: + exec neutron-ovn-metadata-agent
Nov 23 15:52:50 np0005532763 podman[142922]: 2025-11-23 20:52:50.48036613 +0000 UTC m=+0.108903980 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 23 15:52:50 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:52:50 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:52:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:50 np0005532763 systemd[1]: Started ovn_metadata_agent container.
Nov 23 15:52:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:50.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:50 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220003cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:51.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:51 np0005532763 systemd[1]: session-51.scope: Deactivated successfully.
Nov 23 15:52:51 np0005532763 systemd[1]: session-51.scope: Consumed 1min 4.562s CPU time.
Nov 23 15:52:51 np0005532763 systemd-logind[830]: Session 51 logged out. Waiting for processes to exit.
Nov 23 15:52:51 np0005532763 systemd-logind[830]: Removed session 51.
Nov 23 15:52:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:51 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc204001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:51 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc210003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.164 142920 INFO neutron.common.config [-] Logging enabled!#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.164 142920 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.164 142920 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.165 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.165 142920 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.165 142920 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.165 142920 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.165 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.165 142920 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.166 142920 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.166 142920 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.166 142920 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.166 142920 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.166 142920 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.166 142920 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.166 142920 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.166 142920 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.167 142920 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.167 142920 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.167 142920 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.167 142920 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.167 142920 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.167 142920 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.167 142920 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.167 142920 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.167 142920 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.167 142920 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.168 142920 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.168 142920 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.168 142920 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.168 142920 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.168 142920 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.168 142920 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.168 142920 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.168 142920 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.168 142920 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.169 142920 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.169 142920 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.169 142920 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.169 142920 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.169 142920 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.169 142920 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.169 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.169 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.170 142920 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.170 142920 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.170 142920 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.170 142920 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.170 142920 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.170 142920 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.170 142920 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.170 142920 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.170 142920 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.170 142920 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.171 142920 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.171 142920 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.171 142920 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.171 142920 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.171 142920 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.171 142920 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.171 142920 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.171 142920 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.171 142920 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.171 142920 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.172 142920 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.172 142920 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.172 142920 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.172 142920 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.172 142920 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.172 142920 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.172 142920 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.172 142920 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.172 142920 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.173 142920 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.173 142920 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.173 142920 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.173 142920 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.173 142920 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.173 142920 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.173 142920 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.174 142920 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.174 142920 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.174 142920 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.174 142920 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.174 142920 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.174 142920 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.174 142920 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.174 142920 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.174 142920 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.174 142920 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.175 142920 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.175 142920 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.175 142920 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.175 142920 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.175 142920 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.175 142920 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.175 142920 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.175 142920 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.175 142920 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.176 142920 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.176 142920 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.176 142920 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.176 142920 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.176 142920 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.176 142920 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.176 142920 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.177 142920 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.177 142920 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.177 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.177 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.177 142920 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.177 142920 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.177 142920 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.178 142920 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.178 142920 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.178 142920 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.178 142920 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.178 142920 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.178 142920 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.178 142920 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.179 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.179 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.179 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.179 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.179 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.179 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.180 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.180 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.180 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.180 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.180 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.180 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.180 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.181 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.181 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.181 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.181 142920 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.181 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.181 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.181 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.181 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.182 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.182 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.182 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.182 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.182 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.182 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.182 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.182 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.182 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.182 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.183 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.183 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.183 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.183 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.183 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.183 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.183 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.183 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.183 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.184 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.184 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.184 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.184 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.184 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.184 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.184 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.184 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.184 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.185 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.185 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.185 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.185 142920 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.185 142920 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.185 142920 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.185 142920 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.186 142920 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.186 142920 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.186 142920 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.186 142920 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.186 142920 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.186 142920 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.186 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.186 142920 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.186 142920 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.187 142920 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.187 142920 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.187 142920 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.187 142920 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.187 142920 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.187 142920 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.187 142920 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.187 142920 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.187 142920 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.187 142920 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.188 142920 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.188 142920 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.188 142920 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.188 142920 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.188 142920 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.188 142920 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.188 142920 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.188 142920 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.188 142920 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.189 142920 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.189 142920 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.189 142920 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.189 142920 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.189 142920 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.189 142920 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.189 142920 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.189 142920 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.189 142920 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.190 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.190 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.190 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.190 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.190 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.190 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.190 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.190 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.190 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.190 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.191 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.191 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.191 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.191 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.191 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.191 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.191 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.191 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.191 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.192 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.192 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.192 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.192 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.192 142920 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.192 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.192 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.192 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.193 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.193 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.193 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.193 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.193 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.193 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.193 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.193 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.193 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.194 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.194 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.194 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.194 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.194 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.194 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.194 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.194 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.194 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.195 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.195 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.195 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.195 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.195 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.195 142920 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.195 142920 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.195 142920 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.195 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.196 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.196 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.196 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.196 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.196 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.196 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.196 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.196 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.196 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.197 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.197 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.197 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.197 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.197 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.197 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.197 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.197 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.198 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.198 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.198 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.198 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.198 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.198 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.198 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.198 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.199 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.199 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.199 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.199 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.199 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.199 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.199 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.200 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.200 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.200 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.200 142920 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.200 142920 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.209 142920 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.209 142920 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.209 142920 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.209 142920 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.210 142920 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.224 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 10e3bf57-dd2d-4b94-851f-925bcd297dde (UUID: 10e3bf57-dd2d-4b94-851f-925bcd297dde) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.245 142920 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.246 142920 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.246 142920 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.246 142920 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.248 142920 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.254 142920 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.258 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '10e3bf57-dd2d-4b94-851f-925bcd297dde'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], external_ids={}, name=10e3bf57-dd2d-4b94-851f-925bcd297dde, nb_cfg_timestamp=1763931094612, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.259 142920 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f7a37366f70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.260 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.260 142920 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.260 142920 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.260 142920 INFO oslo_service.service [-] Starting 1 workers#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.264 142920 DEBUG oslo_service.service [-] Started child 143028 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.268 142920 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpnc6ydjs9/privsep.sock']#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.268 143028 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-427999'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.289 143028 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.289 143028 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.290 143028 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.292 143028 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.298 143028 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.305 143028 INFO eventlet.wsgi.server [-] (143028) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Nov 23 15:52:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:52 np0005532763 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 23 15:52:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:52.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:52 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc208002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.978 142920 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.979 142920 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpnc6ydjs9/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.849 143034 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.857 143034 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.865 143034 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.866 143034 INFO oslo.privsep.daemon [-] privsep daemon running as pid 143034
Nov 23 15:52:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:52.982 143034 DEBUG oslo.privsep.daemon [-] privsep: reply[0fef5d55-0696-49c7-ba9d-407a31a2a6f9]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 15:52:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:53.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:53 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220003cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:53 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:53.472 143034 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 15:52:53 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:53.472 143034 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 15:52:53 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:53.472 143034 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 15:52:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:53 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc204001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:53 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:53.998 143034 DEBUG oslo.privsep.daemon [-] privsep: reply[d76788a9-29d5-44ea-a1fe-091078c51a57]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.000 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=10e3bf57-dd2d-4b94-851f-925bcd297dde, column=external_ids, values=({'neutron:ovn-metadata-id': '5cc9c550-bded-5389-a3c8-1354e48e7d2c'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.009 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10e3bf57-dd2d-4b94-851f-925bcd297dde, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.023 142920 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.024 142920 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.024 142920 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.024 142920 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.024 142920 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.024 142920 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.024 142920 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.025 142920 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.025 142920 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.025 142920 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.025 142920 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.025 142920 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.025 142920 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.026 142920 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.026 142920 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.026 142920 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.026 142920 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.026 142920 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.026 142920 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.026 142920 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.027 142920 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.027 142920 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.027 142920 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.027 142920 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.027 142920 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.028 142920 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.028 142920 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.028 142920 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.028 142920 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.028 142920 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.028 142920 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.028 142920 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.028 142920 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.029 142920 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.029 142920 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.029 142920 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.029 142920 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.029 142920 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.030 142920 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.030 142920 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.030 142920 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.030 142920 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.030 142920 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.030 142920 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.030 142920 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.031 142920 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.031 142920 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.031 142920 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.031 142920 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.031 142920 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.031 142920 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.031 142920 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.031 142920 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.032 142920 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.032 142920 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.032 142920 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.032 142920 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.032 142920 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.032 142920 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.032 142920 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.033 142920 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.033 142920 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.033 142920 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.033 142920 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.033 142920 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.033 142920 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.033 142920 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.034 142920 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.034 142920 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.034 142920 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.034 142920 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.034 142920 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.034 142920 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.035 142920 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.035 142920 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.035 142920 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.035 142920 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.035 142920 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.035 142920 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.035 142920 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.036 142920 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.036 142920 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.036 142920 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.036 142920 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.036 142920 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.036 142920 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.036 142920 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.036 142920 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.037 142920 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.037 142920 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.037 142920 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.037 142920 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.037 142920 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.037 142920 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.037 142920 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.037 142920 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.038 142920 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.038 142920 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.038 142920 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.038 142920 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.038 142920 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.038 142920 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.038 142920 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.038 142920 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.039 142920 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.039 142920 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.039 142920 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.039 142920 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.039 142920 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.039 142920 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.039 142920 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.039 142920 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.040 142920 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.040 142920 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.040 142920 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.040 142920 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.040 142920 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.040 142920 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.040 142920 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.040 142920 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.040 142920 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.041 142920 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.041 142920 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.041 142920 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.041 142920 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.041 142920 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.041 142920 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.041 142920 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.041 142920 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.041 142920 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.042 142920 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.042 142920 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.042 142920 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.042 142920 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.042 142920 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.042 142920 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.042 142920 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.042 142920 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.043 142920 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.043 142920 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.043 142920 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.043 142920 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.043 142920 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.043 142920 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.043 142920 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.043 142920 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.043 142920 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.044 142920 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.044 142920 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.044 142920 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.044 142920 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.044 142920 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.044 142920 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.044 142920 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.044 142920 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.044 142920 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.045 142920 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.045 142920 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.045 142920 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.045 142920 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.045 142920 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.045 142920 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.045 142920 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.045 142920 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.045 142920 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.045 142920 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.045 142920 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.046 142920 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.046 142920 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.046 142920 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.046 142920 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.046 142920 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.046 142920 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.046 142920 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.046 142920 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.046 142920 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.047 142920 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.047 142920 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.047 142920 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.047 142920 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.047 142920 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.047 142920 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.047 142920 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.047 142920 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.047 142920 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.048 142920 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.048 142920 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.048 142920 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.048 142920 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.048 142920 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.048 142920 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.048 142920 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.048 142920 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.048 142920 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.048 142920 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.049 142920 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.049 142920 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.049 142920 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.049 142920 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.049 142920 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.049 142920 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.049 142920 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.049 142920 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.049 142920 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.050 142920 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.050 142920 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.050 142920 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.050 142920 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.050 142920 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.050 142920 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.050 142920 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.050 142920 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.050 142920 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.050 142920 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.051 142920 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.051 142920 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.051 142920 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.051 142920 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.051 142920 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.051 142920 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.051 142920 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.051 142920 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.051 142920 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.051 142920 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.052 142920 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.052 142920 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.052 142920 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.052 142920 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.052 142920 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.052 142920 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.052 142920 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.052 142920 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.052 142920 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.052 142920 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.053 142920 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.053 142920 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.053 142920 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.053 142920 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.053 142920 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.053 142920 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.053 142920 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.053 142920 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.053 142920 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.053 142920 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.054 142920 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.054 142920 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.054 142920 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.054 142920 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.054 142920 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.054 142920 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.054 142920 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.054 142920 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.054 142920 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.055 142920 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.055 142920 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.055 142920 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.055 142920 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.055 142920 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.055 142920 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.055 142920 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.055 142920 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.055 142920 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.055 142920 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.056 142920 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.056 142920 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.056 142920 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.056 142920 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.056 142920 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.056 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.056 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.056 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.056 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.057 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.057 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.057 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.057 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.057 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.057 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.057 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.057 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.058 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.058 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.058 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.058 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.058 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.058 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.058 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.059 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.059 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.059 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.059 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.059 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.059 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.059 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.059 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.059 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.060 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.060 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.060 142920 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.060 142920 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.060 142920 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.060 142920 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.060 142920 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 15:52:54 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:52:54.060 142920 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 23 15:52:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:52:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:54.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:54 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc210003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:55.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:55 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc208002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:55 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220003cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:56 np0005532763 systemd-logind[830]: New session 52 of user zuul.
Nov 23 15:52:56 np0005532763 systemd[1]: Started Session 52 of User zuul.
Nov 23 15:52:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:56.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:56 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2040036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:57.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:57 np0005532763 python3.9[143196]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:52:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:57 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc210003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:57 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc208003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:57 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:52:58 np0005532763 podman[143226]: 2025-11-23 20:52:58.280314752 +0000 UTC m=+0.161801250 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 15:52:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:52:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:52:58.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:52:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:58 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:59 np0005532763 python3.9[143381]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:52:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:52:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:52:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:52:59.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:52:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:59 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc2040036e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:52:59 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc210003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:52:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:52:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:52:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:52:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:52:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:00 np0005532763 python3.9[143572]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 15:53:00 np0005532763 systemd[1]: Reloading.
Nov 23 15:53:00 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:53:00 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:53:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:53:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:00.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:53:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:53:00 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:53:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:53:00 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:53:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:53:00 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc208003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:01.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[136301]: 23/11/2025 20:53:01 : epoch 692373e9 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc220003cb0 fd 48 proxy ignored for local
Nov 23 15:53:01 np0005532763 kernel: ganesha.nfsd[138478]: segfault at 50 ip 00007fc2e806d32e sp 00007fc2b0ff8210 error 4 in libntirpc.so.5.8[7fc2e8052000+2c000] likely on CPU 6 (core 0, socket 6)
Nov 23 15:53:01 np0005532763 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Nov 23 15:53:01 np0005532763 systemd[1]: Started Process Core Dump (PID 143730/UID 0).
Nov 23 15:53:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:01 np0005532763 python3.9[143762]: ansible-ansible.builtin.service_facts Invoked
Nov 23 15:53:02 np0005532763 systemd-coredump[143734]: Process 136305 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 46:#012#0  0x00007fc2e806d32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Nov 23 15:53:02 np0005532763 systemd[1]: systemd-coredump@4-143730-0.service: Deactivated successfully.
Nov 23 15:53:02 np0005532763 systemd[1]: systemd-coredump@4-143730-0.service: Consumed 1.103s CPU time.
Nov 23 15:53:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:02 np0005532763 podman[143770]: 2025-11-23 20:53:02.737182162 +0000 UTC m=+0.028626417 container died d458b31cba4757362810be60b3fae609e70bcece8679b24575ca18c618ca9ffd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Nov 23 15:53:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:02 np0005532763 systemd[1]: var-lib-containers-storage-overlay-04fef7d934e3f34a3b25ae75ae573b75eb79f8aff9300f9c152e4b1ce08e5a1d-merged.mount: Deactivated successfully.
Nov 23 15:53:02 np0005532763 podman[143770]: 2025-11-23 20:53:02.783915289 +0000 UTC m=+0.075359484 container remove d458b31cba4757362810be60b3fae609e70bcece8679b24575ca18c618ca9ffd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Nov 23 15:53:02 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Main process exited, code=exited, status=139/n/a
Nov 23 15:53:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:02.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:02 np0005532763 network[143820]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 15:53:02 np0005532763 network[143823]: 'network-scripts' will be removed from distribution in near future.
Nov 23 15:53:02 np0005532763 network[143826]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 15:53:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205302 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:53:03 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Failed with result 'exit-code'.
Nov 23 15:53:03 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.708s CPU time.
Nov 23 15:53:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:03.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:53:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:04.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:05.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:06.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:07.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205307 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:53:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [NOTICE] 326/205307 (4) : haproxy version is 2.3.17-d1c9119
Nov 23 15:53:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [NOTICE] 326/205307 (4) : path to executable is /usr/local/sbin/haproxy
Nov 23 15:53:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [ALERT] 326/205307 (4) : backend 'backend' has no server available!
Nov 23 15:53:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:08.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:53:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:09.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:53:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:53:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:09 np0005532763 python3.9[144098]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:53:10 np0005532763 python3.9[144252]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:53:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:10.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:11.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:12 np0005532763 python3.9[144407]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:53:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:12.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:13 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Scheduled restart job, restart counter is at 5.
Nov 23 15:53:13 np0005532763 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:53:13 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.708s CPU time.
Nov 23 15:53:13 np0005532763 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:53:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:13.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:13 np0005532763 podman[144606]: 2025-11-23 20:53:13.474693391 +0000 UTC m=+0.062164294 container create 0237a2fcc484aa33d984ab25ecadb743b60916b41b2a4c68da4c20cf64de66e2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Nov 23 15:53:13 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/801cb7d538f9f3b8c57a9406c77e88a853b9fa9be0f38a01ac9811717faf4d1f/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Nov 23 15:53:13 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/801cb7d538f9f3b8c57a9406c77e88a853b9fa9be0f38a01ac9811717faf4d1f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:53:13 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/801cb7d538f9f3b8c57a9406c77e88a853b9fa9be0f38a01ac9811717faf4d1f/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:53:13 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/801cb7d538f9f3b8c57a9406c77e88a853b9fa9be0f38a01ac9811717faf4d1f/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.dqbktw-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:53:13 np0005532763 podman[144606]: 2025-11-23 20:53:13.445643987 +0000 UTC m=+0.033114940 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:53:13 np0005532763 podman[144606]: 2025-11-23 20:53:13.5577025 +0000 UTC m=+0.145173443 container init 0237a2fcc484aa33d984ab25ecadb743b60916b41b2a4c68da4c20cf64de66e2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 15:53:13 np0005532763 podman[144606]: 2025-11-23 20:53:13.568493242 +0000 UTC m=+0.155964145 container start 0237a2fcc484aa33d984ab25ecadb743b60916b41b2a4c68da4c20cf64de66e2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 15:53:13 np0005532763 bash[144606]: 0237a2fcc484aa33d984ab25ecadb743b60916b41b2a4c68da4c20cf64de66e2
Nov 23 15:53:13 np0005532763 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:53:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:13 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Nov 23 15:53:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:13 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Nov 23 15:53:13 np0005532763 python3.9[144590]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:53:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:13 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Nov 23 15:53:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:13 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Nov 23 15:53:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:13 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Nov 23 15:53:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:13 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Nov 23 15:53:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:13 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Nov 23 15:53:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:13 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:53:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:14 np0005532763 python3.9[144816]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:53:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:53:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:14.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:15.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:15 np0005532763 python3.9[144970]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:53:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:16 np0005532763 python3.9[145124]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:53:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:16.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:17.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205317 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:53:17 np0005532763 python3.9[145278]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:53:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:18 np0005532763 python3.9[145431]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:53:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:18.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:19 np0005532763 python3.9[145584]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:53:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:19.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:53:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:19 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Nov 23 15:53:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:19 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Nov 23 15:53:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:19 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:53:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:19 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:53:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:19 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 15:53:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:19 np0005532763 python3.9[145761]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:53:20 np0005532763 python3.9[145914]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:53:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:20.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:21 np0005532763 podman[146039]: 2025-11-23 20:53:21.208653722 +0000 UTC m=+0.087128845 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 15:53:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:21.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:21 np0005532763 python3.9[146086]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:53:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:21 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:53:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:21 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:53:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:21 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:53:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:22 np0005532763 python3.9[146240]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:53:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:22.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:23 np0005532763 python3.9[146393]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:53:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:23.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:23 np0005532763 python3.9[146546]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:53:24 np0005532763 python3.9[146698]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:53:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:53:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000056s ======
Nov 23 15:53:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:24.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Nov 23 15:53:25 np0005532763 python3.9[146851]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:53:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:25.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:26 np0005532763 python3.9[147004]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:53:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:53:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:26.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:53:27 np0005532763 python3.9[147157]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:53:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:27.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-000000000000000f:nfs.cephfs.1: -2
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:27 : epoch 69237439 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:28 np0005532763 python3.9[147322]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:53:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:28 np0005532763 podman[147423]: 2025-11-23 20:53:28.907426496 +0000 UTC m=+0.156909342 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 23 15:53:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:28.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:29 np0005532763 python3.9[147501]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:53:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:29 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e78000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:29.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:29 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e600016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:29 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e50000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:53:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:30 np0005532763 python3.9[147658]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 23 15:53:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:30.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:31 np0005532763 python3.9[147811]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 15:53:31 np0005532763 systemd[1]: Reloading.
Nov 23 15:53:31 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:53:31 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:53:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:31 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e78000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:31.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205331 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:53:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:31 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e600016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:31 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e600016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:32 np0005532763 python3.9[147998]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:53:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:32.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:33 np0005532763 python3.9[148152]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:53:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:33 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e500016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:33.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:33 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e78000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:33 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e600016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:33 np0005532763 python3.9[148306]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:53:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:53:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:34.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:35 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e600016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:35.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:35 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e500016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:35 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e78000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:35 np0005532763 python3.9[148460]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:53:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:36 np0005532763 python3.9[148614]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:53:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:36.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:37 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e78000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:37.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:37 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e78000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:37 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e500016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:37 : epoch 69237439 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:53:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:38 np0005532763 python3.9[148769]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:53:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:38.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:39 np0005532763 python3.9[148923]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:53:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:39 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e78000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:39.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:39 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e78000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:39 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e60002fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:53:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:40 np0005532763 python3.9[149102]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 23 15:53:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:40 : epoch 69237439 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:53:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:40 : epoch 69237439 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:53:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:40 : epoch 69237439 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:53:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:40.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:41 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e50002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:41.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:41 np0005532763 python3.9[149256]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 23 15:53:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:41 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e78000df0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:41 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e54001800 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:42.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:43 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e60002fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:43.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:43 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e50002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:43 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e78009f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:43 np0005532763 python3.9[149416]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 23 15:53:43 np0005532763 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 15:53:43 np0005532763 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 15:53:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:43 : epoch 69237439 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:53:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:53:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:44.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:44 np0005532763 python3.9[149579]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:53:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:45 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e54002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:45.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:45 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e60002fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:45 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e50002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:45 np0005532763 python3.9[149663]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:53:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:46.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:47 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e78009f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:47.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:47 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e78009f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:47 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e60002fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:48.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:49 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e50003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205349 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:53:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:49.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:49 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e78009f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:49 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e54002c40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:53:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:49 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:53:49 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:53:49 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:53:49 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:53:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:50.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:51 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e60002fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:51.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:51 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e50003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:51 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e78009f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:53:52.202 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 15:53:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:53:52.203 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 15:53:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:53:52.203 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 15:53:52 np0005532763 podman[149767]: 2025-11-23 20:53:52.242897631 +0000 UTC m=+0.111649323 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 15:53:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:52.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:53 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e54002c40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:53.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:53 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e60002fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:53 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e50003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:53:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:54.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:55 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e78009f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:55.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:55 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e54002c40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205355 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:53:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:55 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e60002fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:55 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:53:55 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:53:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:56.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:57 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e50003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:57.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:57 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e78009f00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:57 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e54002c40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:53:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:53:58.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:53:59 np0005532763 podman[149935]: 2025-11-23 20:53:59.257741992 +0000 UTC m=+0.131300664 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 23 15:53:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:59 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e60002fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:53:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:53:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:53:59.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:53:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:59 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e60002fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:53:59 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e50003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:53:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:53:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:53:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:53:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:53:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:54:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:00.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:54:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:54:01 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e68000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:54:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:01.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:54:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:54:01 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e60002fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:54:01 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e48000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:54:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:02.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:54:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:54:03 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e50003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:03.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:54:03 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e680016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:54:03 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e60002fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:54:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:04.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:54:05 : epoch 69237439 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:54:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:54:05 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e480016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:05.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:05 np0005532763 kernel: ganesha.nfsd[150040]: segfault at 50 ip 00007f3f25f5432e sp 00007f3eeb7fd210 error 4 in libntirpc.so.5.8[7f3f25f39000+2c000] likely on CPU 6 (core 0, socket 6)
Nov 23 15:54:05 np0005532763 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Nov 23 15:54:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[144621]: 23/11/2025 20:54:05 : epoch 69237439 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f3e480016a0 fd 48 proxy ignored for local
Nov 23 15:54:05 np0005532763 systemd[1]: Started Process Core Dump (PID 150051/UID 0).
Nov 23 15:54:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:06 np0005532763 systemd-coredump[150052]: Process 144625 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 56:#012#0  0x00007f3f25f5432e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Nov 23 15:54:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:06 np0005532763 systemd[1]: systemd-coredump@5-150051-0.service: Deactivated successfully.
Nov 23 15:54:06 np0005532763 systemd[1]: systemd-coredump@5-150051-0.service: Consumed 1.284s CPU time.
Nov 23 15:54:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:54:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:06.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:54:06 np0005532763 podman[150063]: 2025-11-23 20:54:06.962193945 +0000 UTC m=+0.043510421 container died 0237a2fcc484aa33d984ab25ecadb743b60916b41b2a4c68da4c20cf64de66e2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True)
Nov 23 15:54:06 np0005532763 systemd[1]: var-lib-containers-storage-overlay-801cb7d538f9f3b8c57a9406c77e88a853b9fa9be0f38a01ac9811717faf4d1f-merged.mount: Deactivated successfully.
Nov 23 15:54:07 np0005532763 podman[150063]: 2025-11-23 20:54:07.009106281 +0000 UTC m=+0.090422727 container remove 0237a2fcc484aa33d984ab25ecadb743b60916b41b2a4c68da4c20cf64de66e2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 15:54:07 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Main process exited, code=exited, status=139/n/a
Nov 23 15:54:07 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Failed with result 'exit-code'.
Nov 23 15:54:07 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.868s CPU time.
Nov 23 15:54:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:07.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:08.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:09.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:54:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:10.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:11.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205411 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:54:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:54:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:12.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:54:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:54:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:13.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:54:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:54:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:14 np0005532763 kernel: SELinux:  Converting 2772 SID table entries...
Nov 23 15:54:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:54:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:14.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:54:14 np0005532763 kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 15:54:14 np0005532763 kernel: SELinux:  policy capability open_perms=1
Nov 23 15:54:14 np0005532763 kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 15:54:14 np0005532763 kernel: SELinux:  policy capability always_check_network=0
Nov 23 15:54:14 np0005532763 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 15:54:14 np0005532763 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 15:54:14 np0005532763 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 15:54:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:54:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:15.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:54:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000057s ======
Nov 23 15:54:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:16.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Nov 23 15:54:17 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Scheduled restart job, restart counter is at 6.
Nov 23 15:54:17 np0005532763 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:54:17 np0005532763 dbus-broker-launch[812]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Nov 23 15:54:17 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.868s CPU time.
Nov 23 15:54:17 np0005532763 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:54:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:17.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205417 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:54:17 np0005532763 podman[150173]: 2025-11-23 20:54:17.698289048 +0000 UTC m=+0.064345566 container create 98537d9d7e3710c878f7b57de18b6702a71d8bd53c83c4cf6b43c0860b2c7be5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:54:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:17 np0005532763 podman[150173]: 2025-11-23 20:54:17.667600707 +0000 UTC m=+0.033657275 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:54:17 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe7707439f430e738574a08b880db85dfac10b03d7261373f4cd4f3367589a41/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Nov 23 15:54:17 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe7707439f430e738574a08b880db85dfac10b03d7261373f4cd4f3367589a41/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:54:17 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe7707439f430e738574a08b880db85dfac10b03d7261373f4cd4f3367589a41/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:54:17 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe7707439f430e738574a08b880db85dfac10b03d7261373f4cd4f3367589a41/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.dqbktw-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:54:17 np0005532763 podman[150173]: 2025-11-23 20:54:17.799667947 +0000 UTC m=+0.165724465 container init 98537d9d7e3710c878f7b57de18b6702a71d8bd53c83c4cf6b43c0860b2c7be5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 15:54:17 np0005532763 podman[150173]: 2025-11-23 20:54:17.819744965 +0000 UTC m=+0.185801483 container start 98537d9d7e3710c878f7b57de18b6702a71d8bd53c83c4cf6b43c0860b2c7be5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Nov 23 15:54:17 np0005532763 bash[150173]: 98537d9d7e3710c878f7b57de18b6702a71d8bd53c83c4cf6b43c0860b2c7be5
Nov 23 15:54:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:17 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Nov 23 15:54:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:17 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Nov 23 15:54:17 np0005532763 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:54:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:17 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Nov 23 15:54:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:17 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Nov 23 15:54:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:17 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Nov 23 15:54:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:17 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Nov 23 15:54:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:17 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Nov 23 15:54:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:17 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:54:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:54:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:18.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:54:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:54:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:19.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:54:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:54:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:20.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:21.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:54:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:22.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:54:23 np0005532763 podman[150260]: 2025-11-23 20:54:23.211039732 +0000 UTC m=+0.088878253 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 15:54:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:23.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:23 np0005532763 kernel: SELinux:  Converting 2772 SID table entries...
Nov 23 15:54:23 np0005532763 kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 15:54:23 np0005532763 kernel: SELinux:  policy capability open_perms=1
Nov 23 15:54:23 np0005532763 kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 15:54:23 np0005532763 kernel: SELinux:  policy capability always_check_network=0
Nov 23 15:54:23 np0005532763 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 15:54:23 np0005532763 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 15:54:23 np0005532763 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 15:54:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:24 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:54:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:24 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:54:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:54:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:54:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:24.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:54:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:25.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:26.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:54:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:27.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:54:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:28.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:29.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:54:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:30 np0005532763 dbus-broker-launch[812]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 23 15:54:30 np0005532763 podman[150293]: 2025-11-23 20:54:30.280806399 +0000 UTC m=+0.142752046 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Nov 23 15:54:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:30 : epoch 69237479 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:54:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:30.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:31 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1e0000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:31.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:31 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d40012c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:31 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1bc000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:32.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:33 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:33.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205433 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:54:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:33 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:33 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d40012c0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:54:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000057s ======
Nov 23 15:54:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:34.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Nov 23 15:54:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:35 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:54:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:35.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:54:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:35 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1bc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:35 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205435 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:54:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:54:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:36.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:54:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:37 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:54:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:37.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:54:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:37 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d40023e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:37 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1bc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:54:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:38.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:54:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:39 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:54:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:39.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:54:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:39 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:39 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d40023e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:54:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:54:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:40.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:54:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:41 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1bc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:41.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:41 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:41 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:42.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:43 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d40023e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:43.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:43 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1bc002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:43 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:54:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:44.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:45 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8001ec0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:45.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:45 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d40023e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:45 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1bc002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:46 : epoch 69237479 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:54:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:47.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:47 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1bc002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:47.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:47 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8001ec0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:47 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d40023e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:54:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:49.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:54:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:49 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1bc002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:49.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:49 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1bc002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:49 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:54:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:49 : epoch 69237479 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:54:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:49 : epoch 69237479 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:54:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:54:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:51.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:54:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:51 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d40023e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:51.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:51 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d40023e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:51 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c40038b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:54:52.204 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 15:54:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:54:52.206 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 15:54:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:54:52.206 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 15:54:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:52 : epoch 69237479 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:54:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:54:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:53.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:54:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:53 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:53.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:53 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1bc003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:53 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d40023e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:54 np0005532763 podman[158179]: 2025-11-23 20:54:54.205417702 +0000 UTC m=+0.081360130 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 23 15:54:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:54:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:55.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:55 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c40038b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:55.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:55 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:55 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1bc003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:56 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Nov 23 15:54:56 np0005532763 podman[159059]: 2025-11-23 20:54:56.024553117 +0000 UTC m=+0.105721818 container exec 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 15:54:56 np0005532763 podman[159059]: 2025-11-23 20:54:56.148786918 +0000 UTC m=+0.229955619 container exec_died 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Nov 23 15:54:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:56 np0005532763 podman[159533]: 2025-11-23 20:54:56.795311521 +0000 UTC m=+0.068138266 container exec bfa89024a4f3a8c3745fbdf8141ab9c1af6ff603988de647c9e7f7e15dff8638 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 15:54:56 np0005532763 podman[159533]: 2025-11-23 20:54:56.805372206 +0000 UTC m=+0.078198961 container exec_died bfa89024a4f3a8c3745fbdf8141ab9c1af6ff603988de647c9e7f7e15dff8638 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 15:54:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:57.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:57 np0005532763 podman[159816]: 2025-11-23 20:54:57.25062373 +0000 UTC m=+0.081614027 container exec 98537d9d7e3710c878f7b57de18b6702a71d8bd53c83c4cf6b43c0860b2c7be5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Nov 23 15:54:57 np0005532763 podman[159816]: 2025-11-23 20:54:57.270873543 +0000 UTC m=+0.101863840 container exec_died 98537d9d7e3710c878f7b57de18b6702a71d8bd53c83c4cf6b43c0860b2c7be5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Nov 23 15:54:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:57 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d40023e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:57 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c40038b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:57.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:57 np0005532763 podman[160022]: 2025-11-23 20:54:57.600367315 +0000 UTC m=+0.094402539 container exec 187afc4c1e67339be091cc4caff41c0e2aaba4673fc086f757180d516596ee6c (image=quay.io/ceph/haproxy:2.3, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem)
Nov 23 15:54:57 np0005532763 podman[160022]: 2025-11-23 20:54:57.616773259 +0000 UTC m=+0.110808423 container exec_died 187afc4c1e67339be091cc4caff41c0e2aaba4673fc086f757180d516596ee6c (image=quay.io/ceph/haproxy:2.3, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem)
Nov 23 15:54:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:57 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c40038b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:57 np0005532763 podman[160230]: 2025-11-23 20:54:57.941809856 +0000 UTC m=+0.083099520 container exec f83166e24f35928d8e85c6352ec69e598c685dd22eb2d34bc93aec691f658844 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., description=keepalived for Ceph, release=1793, name=keepalived, architecture=x86_64, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, io.openshift.expose-services=)
Nov 23 15:54:57 np0005532763 podman[160230]: 2025-11-23 20:54:57.960822223 +0000 UTC m=+0.102111837 container exec_died f83166e24f35928d8e85c6352ec69e598c685dd22eb2d34bc93aec691f658844 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt, io.k8s.display-name=Keepalived on RHEL 9, release=1793, architecture=x86_64, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, name=keepalived, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, vendor=Red Hat, Inc., description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git)
Nov 23 15:54:58 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:54:58 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:54:58 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 23 15:54:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:54:59.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:59 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:54:59 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:54:59 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 23 15:54:59 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:54:59 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:54:59 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:54:59 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:54:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:59 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1bc003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:59 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d40023e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:54:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:54:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:54:59.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:54:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205459 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:54:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:54:59 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c40038b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:54:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:54:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:54:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:54:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:54:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:55:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:01.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:55:01 np0005532763 podman[161815]: 2025-11-23 20:55:01.20731064 +0000 UTC m=+0.095795588 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 23 15:55:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:01 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:01 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1bc003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:01.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:01 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d40023e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:55:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:03.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:55:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:03 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d40023e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:03 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:03.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:03 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:55:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:04 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:55:04 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:55:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:55:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:05.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:55:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:05 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1dc001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:05 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0000d90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:05.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:05 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:55:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:07.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:55:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:07 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a80016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:07 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1dc001ac0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:55:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:07.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:55:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:07 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b00018b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:55:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:09.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:55:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:09 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:09 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a80016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:55:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:09.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:55:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:09 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1dc0027d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:55:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:55:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:11.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:55:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:11 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b00018b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:11 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b00018b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:55:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:11.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:55:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:11 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a80016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:13.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:13 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1dc0027d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:13 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:13.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:13 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b00018b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:55:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:15.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:15 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:15 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1dc0027d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:15.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:15 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:17.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:17 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003130 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:17 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:55:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:17.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:55:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:17 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1dc003c60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205517 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:55:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:55:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:19.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:55:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:19 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:19 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003130 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:19.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:19 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8002b10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:55:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:55:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:21.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:55:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:21 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1dc003c60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:21 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:55:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:21.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:55:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:21 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000056s ======
Nov 23 15:55:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:23.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Nov 23 15:55:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:23 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4003b00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:23 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1dc003c60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:55:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:23.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:55:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:23 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:55:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:55:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:25.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:55:25 np0005532763 podman[167938]: 2025-11-23 20:55:25.233701433 +0000 UTC m=+0.102994470 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 23 15:55:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:25 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:25 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4003b20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:25.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:25 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1dc004970 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:26 : epoch 69237479 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:55:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:27.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:27 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:27 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:55:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:27.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:55:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:27 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:27 np0005532763 kernel: SELinux:  Converting 2773 SID table entries...
Nov 23 15:55:27 np0005532763 kernel: SELinux:  policy capability network_peer_controls=1
Nov 23 15:55:27 np0005532763 kernel: SELinux:  policy capability open_perms=1
Nov 23 15:55:27 np0005532763 kernel: SELinux:  policy capability extended_socket_class=1
Nov 23 15:55:27 np0005532763 kernel: SELinux:  policy capability always_check_network=0
Nov 23 15:55:27 np0005532763 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 23 15:55:27 np0005532763 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 23 15:55:27 np0005532763 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 23 15:55:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:28 np0005532763 dbus-broker-launch[794]: Noticed file-system modification, trigger reload.
Nov 23 15:55:28 np0005532763 dbus-broker-launch[812]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 23 15:55:28 np0005532763 dbus-broker-launch[794]: Noticed file-system modification, trigger reload.
Nov 23 15:55:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:55:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:29.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:55:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:29 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4003b40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:29 : epoch 69237479 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:55:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:29 : epoch 69237479 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:55:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:29 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4003b40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:29.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:29 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4003b40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:55:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:31.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:31 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:31 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1dc004970 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:31.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:31 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:32 np0005532763 podman[168034]: 2025-11-23 20:55:32.340044155 +0000 UTC m=+0.208443093 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true)
Nov 23 15:55:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:32 : epoch 69237479 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:55:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:55:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:33.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:55:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:33 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4003b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:33 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:55:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:33.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:55:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:33 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1dc004970 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:55:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:35.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:35 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:35 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4003b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:55:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:35.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:55:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:35 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:37.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:37 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d4001b90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:37 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:37.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:37 np0005532763 systemd[1]: Stopping OpenSSH server daemon...
Nov 23 15:55:37 np0005532763 systemd[1]: sshd.service: Deactivated successfully.
Nov 23 15:55:37 np0005532763 systemd[1]: Stopped OpenSSH server daemon.
Nov 23 15:55:37 np0005532763 systemd[1]: sshd.service: Consumed 3.948s CPU time, read 32.0K from disk, written 28.0K to disk.
Nov 23 15:55:37 np0005532763 systemd[1]: Stopped target sshd-keygen.target.
Nov 23 15:55:37 np0005532763 systemd[1]: Stopping sshd-keygen.target...
Nov 23 15:55:37 np0005532763 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 15:55:37 np0005532763 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 15:55:37 np0005532763 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 23 15:55:37 np0005532763 systemd[1]: Reached target sshd-keygen.target.
Nov 23 15:55:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205537 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:55:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:37 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4003b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:37 np0005532763 systemd[1]: Starting OpenSSH server daemon...
Nov 23 15:55:37 np0005532763 systemd[1]: Started OpenSSH server daemon.
Nov 23 15:55:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:39.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:39 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:39 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d4001d10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:39.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:39 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:55:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:40 np0005532763 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 15:55:40 np0005532763 systemd[1]: Starting man-db-cache-update.service...
Nov 23 15:55:40 np0005532763 systemd[1]: Reloading.
Nov 23 15:55:40 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:55:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:40 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:55:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:40 np0005532763 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 15:55:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:41.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:41 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4003b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:41 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:55:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:41.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:55:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:41 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d4002630 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:55:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:43.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:55:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:43 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:43 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:43.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:43 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:44 np0005532763 python3.9[172066]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 15:55:44 np0005532763 systemd[1]: Reloading.
Nov 23 15:55:44 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:55:44 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:55:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:55:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:45.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:45 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:45 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4003b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:45.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:45 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:45 np0005532763 python3.9[173296]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 15:55:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:45 np0005532763 systemd[1]: Reloading.
Nov 23 15:55:45 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:55:45 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:55:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:46 np0005532763 python3.9[174501]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 15:55:47 np0005532763 systemd[1]: Reloading.
Nov 23 15:55:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:47.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:47 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:55:47 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:55:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:47 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:47 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d4002f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:47.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:47 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4003b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:48 np0005532763 python3.9[175560]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 15:55:48 np0005532763 systemd[1]: Reloading.
Nov 23 15:55:48 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:55:48 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:55:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:55:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:49.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:55:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:49 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:49 np0005532763 python3.9[176728]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:55:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:49 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:49.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:49 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d4002f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:55:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:50 np0005532763 systemd[1]: Reloading.
Nov 23 15:55:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:50 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:55:50 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:55:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:51.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:51 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4003b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:51 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:51.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:51 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:51 np0005532763 python3.9[178491]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:55:51 np0005532763 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 15:55:51 np0005532763 systemd[1]: Finished man-db-cache-update.service.
Nov 23 15:55:51 np0005532763 systemd[1]: man-db-cache-update.service: Consumed 14.528s CPU time.
Nov 23 15:55:51 np0005532763 systemd[1]: Reloading.
Nov 23 15:55:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:52 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:55:52 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:55:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:55:52.206 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 15:55:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:55:52.207 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 15:55:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:55:52.208 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 15:55:52 np0005532763 systemd[1]: run-r9fe3d167fddf40eeb051116a817c4b8b.service: Deactivated successfully.
Nov 23 15:55:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:55:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:53.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:55:53 np0005532763 python3.9[178875]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:55:53 np0005532763 systemd[1]: Reloading.
Nov 23 15:55:53 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:55:53 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:55:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:53 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d4002f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:53 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4003b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:53.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:53 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:54 np0005532763 python3.9[179066]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:55:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:55:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:55.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:55 np0005532763 podman[179222]: 2025-11-23 20:55:55.428207526 +0000 UTC m=+0.119287885 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 15:55:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:55 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:55 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d4004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:55 np0005532763 python3.9[179223]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:55:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:55:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:55.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:55:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:55 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4003b80 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:55 np0005532763 systemd[1]: Reloading.
Nov 23 15:55:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:55 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:55:55 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:55:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:57 np0005532763 python3.9[179433]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 23 15:55:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:55:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:57.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:55:57 np0005532763 systemd[1]: Reloading.
Nov 23 15:55:57 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:55:57 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:55:57 np0005532763 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 23 15:55:57 np0005532763 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 23 15:55:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:57 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4003b80 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:57 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:57.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:57 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d4004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:58 np0005532763 python3.9[179627]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:55:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:55:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:55:59.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:55:59 np0005532763 python3.9[179783]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:55:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:59 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d4004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:59 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1c4003ba0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:55:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:55:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:55:59.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:55:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:55:59 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:55:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:55:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:55:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:55:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:55:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:00 np0005532763 python3.9[179939]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:56:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:01.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:01 np0005532763 python3.9[180120]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:56:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:01 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d4004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:01 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d4004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:56:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:01.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:56:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:01 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:02 np0005532763 python3.9[180277]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:56:02 np0005532763 podman[180279]: 2025-11-23 20:56:02.568540785 +0000 UTC m=+0.149379558 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 23 15:56:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:56:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:03.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:56:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:03 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:03 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:03.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:03 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:04 np0005532763 python3.9[180494]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:56:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:56:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:56:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:05.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:56:05 np0005532763 python3.9[180700]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:56:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:05 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:05 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0003e40 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:05.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:05 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:06 np0005532763 python3.9[180856]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:56:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:56:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:07.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:56:07 np0005532763 python3.9[181012]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:56:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:07 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:07 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d4004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:07.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:07 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1dc0037c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:08 np0005532763 python3.9[181169]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:56:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:56:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:56:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:56:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:56:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:56:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:56:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:56:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:09.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:56:09 np0005532763 python3.9[181325]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:56:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:09 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8001230 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:09 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:56:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:09.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:56:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:09 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d4004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:56:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:56:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:11.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:56:11 np0005532763 python3.9[181482]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:56:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:11 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1dc0023e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:11 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8003260 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:11.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:11 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:12 np0005532763 python3.9[181638]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:56:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:56:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:13.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:56:13 np0005532763 python3.9[181819]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 23 15:56:13 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:56:13 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:56:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:13 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:13 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1dc0023e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:56:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:13.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:56:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:13 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8003260 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:14 np0005532763 python3.9[181975]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:56:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:56:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:56:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:15.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:56:15 np0005532763 python3.9[182128]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:56:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:15 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1d4004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:15 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000054s ======
Nov 23 15:56:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:15.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000054s
Nov 23 15:56:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:15 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1dc0023e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:16 np0005532763 python3.9[182281]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:56:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:17 np0005532763 python3.9[182434]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:56:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:17.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:17 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8003260 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:17 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b8003260 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:17.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:17 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a8003c90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:17 np0005532763 python3.9[182586]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:56:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:18 np0005532763 python3.9[182739]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:56:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:56:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:19.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:56:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:19 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1dc0023e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:19 np0005532763 python3.9[182892]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:19 np0005532763 kernel: ganesha.nfsd[181040]: segfault at 50 ip 00007fa29286e32e sp 00007fa265ffa210 error 4 in libntirpc.so.5.8[7fa292853000+2c000] likely on CPU 7 (core 0, socket 7)
Nov 23 15:56:19 np0005532763 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Nov 23 15:56:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[150188]: 23/11/2025 20:56:19 : epoch 69237479 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1dc0023e0 fd 47 proxy ignored for local
Nov 23 15:56:19 np0005532763 systemd[1]: Started Process Core Dump (PID 182894/UID 0).
Nov 23 15:56:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:19.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:56:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:20 np0005532763 python3.9[183020]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763931378.8422909-1624-166691987502835/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:20 np0005532763 systemd-coredump[182896]: Process 150192 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 61:#012#0  0x00007fa29286e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Nov 23 15:56:20 np0005532763 systemd[1]: systemd-coredump@6-182894-0.service: Deactivated successfully.
Nov 23 15:56:20 np0005532763 systemd[1]: systemd-coredump@6-182894-0.service: Consumed 1.193s CPU time.
Nov 23 15:56:20 np0005532763 podman[183201]: 2025-11-23 20:56:20.904724597 +0000 UTC m=+0.031555773 container died 98537d9d7e3710c878f7b57de18b6702a71d8bd53c83c4cf6b43c0860b2c7be5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:56:20 np0005532763 systemd[1]: var-lib-containers-storage-overlay-fe7707439f430e738574a08b880db85dfac10b03d7261373f4cd4f3367589a41-merged.mount: Deactivated successfully.
Nov 23 15:56:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:20 np0005532763 podman[183201]: 2025-11-23 20:56:20.96254466 +0000 UTC m=+0.089375816 container remove 98537d9d7e3710c878f7b57de18b6702a71d8bd53c83c4cf6b43c0860b2c7be5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Nov 23 15:56:20 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Main process exited, code=exited, status=139/n/a
Nov 23 15:56:21 np0005532763 python3.9[183212]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:21.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:21 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Failed with result 'exit-code'.
Nov 23 15:56:21 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.985s CPU time.
Nov 23 15:56:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:21.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:21 np0005532763 python3.9[183370]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763931380.5215416-1624-72377236712808/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:22 np0005532763 python3.9[183523]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:23.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:23 np0005532763 python3.9[183649]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763931381.9332511-1624-258413543835755/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:23.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:24 np0005532763 python3.9[183802]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:24 np0005532763 python3.9[183927]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763931383.4163852-1624-262709225537938/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:56:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:56:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:25.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:56:25 np0005532763 python3.9[184080]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205625 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:56:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:25.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:25 np0005532763 podman[184178]: 2025-11-23 20:56:25.994614153 +0000 UTC m=+0.071851768 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 23 15:56:26 np0005532763 python3.9[184224]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763931384.8836262-1624-1014090052984/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:26 np0005532763 python3.9[184377]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:56:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:27.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:56:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:27.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:27 np0005532763 python3.9[184502]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763931386.374532-1624-21503023847506/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:28 np0005532763 python3.9[184655]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:29.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:29 np0005532763 python3.9[184779]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763931387.922451-1624-99533892108638/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:29.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:56:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:30 np0005532763 python3.9[184932]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:30 np0005532763 python3.9[185057]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763931389.4202986-1624-113822135200454/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:56:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:31.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:56:31 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Scheduled restart job, restart counter is at 7.
Nov 23 15:56:31 np0005532763 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:56:31 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.985s CPU time.
Nov 23 15:56:31 np0005532763 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:56:31 np0005532763 podman[185262]: 2025-11-23 20:56:31.615422761 +0000 UTC m=+0.046470632 container create 8e68cbba1c18ac5016ba5211df2d86d0715523850d14df02b4746fc3336a1cb5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid)
Nov 23 15:56:31 np0005532763 python3.9[185229]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 23 15:56:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:31.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:31 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae1174dc76ce25b6020ef4c598bade4703b3b44a820e32342c50cf85ed11316d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Nov 23 15:56:31 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae1174dc76ce25b6020ef4c598bade4703b3b44a820e32342c50cf85ed11316d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:56:31 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae1174dc76ce25b6020ef4c598bade4703b3b44a820e32342c50cf85ed11316d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:56:31 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae1174dc76ce25b6020ef4c598bade4703b3b44a820e32342c50cf85ed11316d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.dqbktw-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:56:31 np0005532763 podman[185262]: 2025-11-23 20:56:31.595024896 +0000 UTC m=+0.026072747 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:56:31 np0005532763 podman[185262]: 2025-11-23 20:56:31.695011595 +0000 UTC m=+0.126059506 container init 8e68cbba1c18ac5016ba5211df2d86d0715523850d14df02b4746fc3336a1cb5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 15:56:31 np0005532763 podman[185262]: 2025-11-23 20:56:31.704040176 +0000 UTC m=+0.135088037 container start 8e68cbba1c18ac5016ba5211df2d86d0715523850d14df02b4746fc3336a1cb5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 23 15:56:31 np0005532763 bash[185262]: 8e68cbba1c18ac5016ba5211df2d86d0715523850d14df02b4746fc3336a1cb5
Nov 23 15:56:31 np0005532763 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:56:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:31 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Nov 23 15:56:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:31 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Nov 23 15:56:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:31 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Nov 23 15:56:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:31 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Nov 23 15:56:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:31 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Nov 23 15:56:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:31 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Nov 23 15:56:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:31 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Nov 23 15:56:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:31 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:56:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:32 np0005532763 python3.9[185473]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:33.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:33 np0005532763 podman[185598]: 2025-11-23 20:56:33.167126822 +0000 UTC m=+0.132573859 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 23 15:56:33 np0005532763 python3.9[185643]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:33.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:34 np0005532763 python3.9[185805]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:56:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:34 np0005532763 python3.9[185958]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:35.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:35 np0005532763 python3.9[186110]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:56:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:35.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:56:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:36 np0005532763 python3.9[186263]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:37.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:37 np0005532763 python3.9[186416]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:56:37.625941) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931397625993, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4701, "num_deletes": 502, "total_data_size": 12907384, "memory_usage": 13076952, "flush_reason": "Manual Compaction"}
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931397669373, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 8359352, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13340, "largest_seqno": 18036, "table_properties": {"data_size": 8341630, "index_size": 11976, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 4677, "raw_key_size": 36450, "raw_average_key_size": 19, "raw_value_size": 8305186, "raw_average_value_size": 4482, "num_data_blocks": 524, "num_entries": 1853, "num_filter_entries": 1853, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930950, "oldest_key_time": 1763930950, "file_creation_time": 1763931397, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 43523 microseconds, and 30066 cpu microseconds.
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:56:37.669466) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 8359352 bytes OK
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:56:37.669495) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:56:37.671176) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:56:37.671199) EVENT_LOG_v1 {"time_micros": 1763931397671192, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:56:37.671220) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 12887103, prev total WAL file size 12887103, number of live WAL files 2.
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:56:37.676108) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(8163KB)], [27(12MB)]
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931397676158, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 21794356, "oldest_snapshot_seqno": -1}
Nov 23 15:56:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:37.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 5078 keys, 15937486 bytes, temperature: kUnknown
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931397755475, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 15937486, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15898706, "index_size": 24974, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12741, "raw_key_size": 127041, "raw_average_key_size": 25, "raw_value_size": 15801872, "raw_average_value_size": 3111, "num_data_blocks": 1050, "num_entries": 5078, "num_filter_entries": 5078, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763931397, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:56:37.755874) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 15937486 bytes
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:56:37.757192) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 274.2 rd, 200.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(8.0, 12.8 +0.0 blob) out(15.2 +0.0 blob), read-write-amplify(4.5) write-amplify(1.9) OK, records in: 6100, records dropped: 1022 output_compression: NoCompression
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:56:37.757223) EVENT_LOG_v1 {"time_micros": 1763931397757209, "job": 14, "event": "compaction_finished", "compaction_time_micros": 79477, "compaction_time_cpu_micros": 55057, "output_level": 6, "num_output_files": 1, "total_output_size": 15937486, "num_input_records": 6100, "num_output_records": 5078, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931397760409, "job": 14, "event": "table_file_deletion", "file_number": 29}
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931397765231, "job": 14, "event": "table_file_deletion", "file_number": 27}
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:56:37.676014) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:56:37.765444) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:56:37.765451) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:56:37.765454) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:56:37.765457) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:56:37 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:56:37.765460) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:56:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:37 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:56:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:37 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:56:37 np0005532763 python3.9[186569]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:38 np0005532763 python3.9[186721]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:56:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:39.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:56:39 np0005532763 python3.9[186874]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:39.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:56:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:40 np0005532763 python3.9[187027]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:41 np0005532763 python3.9[187205]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:41.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:41.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:41 np0005532763 python3.9[187358]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:42 np0005532763 python3.9[187510]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:43.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:43 np0005532763 python3.9[187663]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:43.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:43 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:56:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:43 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Nov 23 15:56:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:43 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Nov 23 15:56:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:43 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Nov 23 15:56:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:43 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Nov 23 15:56:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:43 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Nov 23 15:56:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:43 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Nov 23 15:56:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:43 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:56:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:43 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:56:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:43 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:44 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:44 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:44 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:44 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:44 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:44 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:44 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:44 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:44 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:44 : epoch 692374ff : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:44 : epoch 692374ff : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:44 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:44 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:44 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:44 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:44 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:44 : epoch 692374ff : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:56:44 np0005532763 python3.9[187799]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931403.0147388-2287-198559994748208/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:45.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:45 np0005532763 python3.9[187952]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:45 : epoch 692374ff : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c7c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:45 : epoch 692374ff : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c70001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:45.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:45 : epoch 692374ff : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c70001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:45 np0005532763 python3.9[188078]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931404.621309-2287-16839479334730/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:46 np0005532763 python3.9[188230]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000026s ======
Nov 23 15:56:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:47.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Nov 23 15:56:47 np0005532763 python3.9[188354]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931406.027666-2287-11912391946860/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205647 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:56:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:47 : epoch 692374ff : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c78001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:47 : epoch 692374ff : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c78001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000053s ======
Nov 23 15:56:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:47.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000053s
Nov 23 15:56:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:47 : epoch 692374ff : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c58000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:56:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:48 np0005532763 python3.9[188507]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:48 np0005532763 python3.9[188631]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931407.5735965-2287-274839082562330/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:56:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:49.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:56:49 np0005532763 kernel: ganesha.nfsd[187743]: segfault at 50 ip 00007f0d2935332e sp 00007f0cef7fd210 error 4 in libntirpc.so.5.8[7f0d29338000+2c000] likely on CPU 1 (core 0, socket 1)
Nov 23 15:56:49 np0005532763 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Nov 23 15:56:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[185280]: 23/11/2025 20:56:49 : epoch 692374ff : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f0c70001c00 fd 38 proxy ignored for local
Nov 23 15:56:49 np0005532763 systemd[1]: Started Process Core Dump (PID 188784/UID 0).
Nov 23 15:56:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:56:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:49.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:56:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:56:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:49 np0005532763 python3.9[188783]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:50 np0005532763 python3.9[188909]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931409.1590602-2287-167623351582217/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:50 np0005532763 systemd-coredump[188786]: Process 185293 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 43:#012#0  0x00007f0d2935332e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Nov 23 15:56:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:50 np0005532763 systemd[1]: systemd-coredump@7-188784-0.service: Deactivated successfully.
Nov 23 15:56:50 np0005532763 systemd[1]: systemd-coredump@7-188784-0.service: Consumed 1.141s CPU time.
Nov 23 15:56:50 np0005532763 podman[189014]: 2025-11-23 20:56:50.937492212 +0000 UTC m=+0.052361368 container died 8e68cbba1c18ac5016ba5211df2d86d0715523850d14df02b4746fc3336a1cb5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 23 15:56:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:50 np0005532763 systemd[1]: var-lib-containers-storage-overlay-ae1174dc76ce25b6020ef4c598bade4703b3b44a820e32342c50cf85ed11316d-merged.mount: Deactivated successfully.
Nov 23 15:56:50 np0005532763 podman[189014]: 2025-11-23 20:56:50.988091033 +0000 UTC m=+0.102960129 container remove 8e68cbba1c18ac5016ba5211df2d86d0715523850d14df02b4746fc3336a1cb5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Nov 23 15:56:50 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Main process exited, code=exited, status=139/n/a
Nov 23 15:56:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:51.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:51 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Failed with result 'exit-code'.
Nov 23 15:56:51 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.656s CPU time.
Nov 23 15:56:51 np0005532763 python3.9[189094]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:51.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:51 np0005532763 python3.9[189231]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931410.7067192-2287-229670491853332/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:56:52.208 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 15:56:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:56:52.209 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 15:56:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:56:52.209 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 15:56:52 np0005532763 python3.9[189383]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:53.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:53 np0005532763 python3.9[189507]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931412.153207-2287-63683226069262/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:53.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:54 np0005532763 python3.9[189660]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:56:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:54 np0005532763 python3.9[189784]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931413.6042945-2287-175888930661134/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:55.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205655 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:56:55 np0005532763 python3.9[189936]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:55.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:56 np0005532763 podman[190032]: 2025-11-23 20:56:56.209517761 +0000 UTC m=+0.088513723 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 23 15:56:56 np0005532763 python3.9[190080]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931415.0691445-2287-70839926878606/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:57.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:57 np0005532763 python3.9[190233]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:57.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:57 np0005532763 python3.9[190357]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931416.6053345-2287-66236583110356/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:58 np0005532763 python3.9[190509]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:56:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:56:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:56:59.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:56:59 np0005532763 python3.9[190633]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931418.0857968-2287-216646028671116/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:56:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:56:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:56:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:56:59.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:56:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:56:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:56:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:56:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:56:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:00 np0005532763 python3.9[190786]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:57:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:00 np0005532763 python3.9[190934]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931419.5620854-2287-103067455452724/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:01.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:01 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Scheduled restart job, restart counter is at 8.
Nov 23 15:57:01 np0005532763 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:57:01 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.656s CPU time.
Nov 23 15:57:01 np0005532763 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:57:01 np0005532763 python3.9[191110]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:57:01 np0005532763 podman[191135]: 2025-11-23 20:57:01.627624545 +0000 UTC m=+0.071797369 container create 1728af6d53e57d3f8b3f3f8131144b88067da973514629af244789841842ab0d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 23 15:57:01 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6aa81a5907dcb568d3c07d8c743292b16e023028bf1f10e0c6c04a0729a49242/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Nov 23 15:57:01 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6aa81a5907dcb568d3c07d8c743292b16e023028bf1f10e0c6c04a0729a49242/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:57:01 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6aa81a5907dcb568d3c07d8c743292b16e023028bf1f10e0c6c04a0729a49242/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:57:01 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6aa81a5907dcb568d3c07d8c743292b16e023028bf1f10e0c6c04a0729a49242/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.dqbktw-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:57:01 np0005532763 podman[191135]: 2025-11-23 20:57:01.596984124 +0000 UTC m=+0.041157008 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:57:01 np0005532763 podman[191135]: 2025-11-23 20:57:01.69186788 +0000 UTC m=+0.136040744 container init 1728af6d53e57d3f8b3f3f8131144b88067da973514629af244789841842ab0d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 23 15:57:01 np0005532763 podman[191135]: 2025-11-23 20:57:01.704106814 +0000 UTC m=+0.148279608 container start 1728af6d53e57d3f8b3f3f8131144b88067da973514629af244789841842ab0d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Nov 23 15:57:01 np0005532763 bash[191135]: 1728af6d53e57d3f8b3f3f8131144b88067da973514629af244789841842ab0d
Nov 23 15:57:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:01 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Nov 23 15:57:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:01 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Nov 23 15:57:01 np0005532763 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:57:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:01.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:01 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Nov 23 15:57:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:01 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Nov 23 15:57:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:01 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Nov 23 15:57:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:01 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Nov 23 15:57:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:01 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Nov 23 15:57:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:01 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:57:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:02 np0005532763 python3.9[191316]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931420.9855223-2287-1645776846286/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:03 np0005532763 python3.9[191469]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:57:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:03.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:03 np0005532763 podman[191564]: 2025-11-23 20:57:03.688358019 +0000 UTC m=+0.131739163 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 23 15:57:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:03.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:03 np0005532763 python3.9[191613]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931422.5519114-2287-148806499250269/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:04 np0005532763 python3.9[191769]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:57:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:57:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:05.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:05 np0005532763 python3.9[191925]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 23 15:57:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:05.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:07.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:07.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:07 np0005532763 dbus-broker-launch[812]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 23 15:57:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:07 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:57:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:07 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:57:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:08 np0005532763 python3.9[192084]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:08 np0005532763 python3.9[192237]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:09.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:57:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:09.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:09 np0005532763 python3.9[192390]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:10 np0005532763 python3.9[192542]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:57:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:11.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:57:11 np0005532763 python3.9[192695]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:11.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:57:12.508991) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931432509072, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 546, "num_deletes": 252, "total_data_size": 894695, "memory_usage": 905632, "flush_reason": "Manual Compaction"}
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931432514400, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 411643, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18041, "largest_seqno": 18582, "table_properties": {"data_size": 409057, "index_size": 622, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6605, "raw_average_key_size": 19, "raw_value_size": 403893, "raw_average_value_size": 1191, "num_data_blocks": 28, "num_entries": 339, "num_filter_entries": 339, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763931398, "oldest_key_time": 1763931398, "file_creation_time": 1763931432, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 5454 microseconds, and 3014 cpu microseconds.
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:57:12.514450) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 411643 bytes OK
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:57:12.514473) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:57:12.515841) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:57:12.515862) EVENT_LOG_v1 {"time_micros": 1763931432515855, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:57:12.515884) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 891569, prev total WAL file size 891569, number of live WAL files 2.
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:57:12.516901) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323531' seq:72057594037927935, type:22 .. '6D67727374617400353034' seq:0, type:0; will stop at (end)
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(401KB)], [30(15MB)]
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931432516953, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 16349129, "oldest_snapshot_seqno": -1}
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4916 keys, 12435120 bytes, temperature: kUnknown
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931432572384, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 12435120, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12401628, "index_size": 20070, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12357, "raw_key_size": 124059, "raw_average_key_size": 25, "raw_value_size": 12311841, "raw_average_value_size": 2504, "num_data_blocks": 836, "num_entries": 4916, "num_filter_entries": 4916, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763931432, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:57:12.572747) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 12435120 bytes
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:57:12.574408) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 294.2 rd, 223.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 15.2 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(69.9) write-amplify(30.2) OK, records in: 5417, records dropped: 501 output_compression: NoCompression
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:57:12.574433) EVENT_LOG_v1 {"time_micros": 1763931432574422, "job": 16, "event": "compaction_finished", "compaction_time_micros": 55572, "compaction_time_cpu_micros": 23721, "output_level": 6, "num_output_files": 1, "total_output_size": 12435120, "num_input_records": 5417, "num_output_records": 4916, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931432574699, "job": 16, "event": "table_file_deletion", "file_number": 32}
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931432578645, "job": 16, "event": "table_file_deletion", "file_number": 30}
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:57:12.516834) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:57:12.578745) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:57:12.578752) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:57:12.578755) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:57:12.578758) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:57:12 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:57:12.578760) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:57:12 np0005532763 python3.9[192848]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:13.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:13 np0005532763 python3.9[193059]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:13.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:13 : epoch 6923751d : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:57:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:14 np0005532763 python3.9[193249]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:14 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:57:14 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:57:14 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:57:14 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:57:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:57:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:14 np0005532763 python3.9[193402]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:15.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[191156]: 23/11/2025 20:57:15 : epoch 6923751d : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5364000df0 fd 38 proxy ignored for local
Nov 23 15:57:15 np0005532763 kernel: ganesha.nfsd[193221]: segfault at 50 ip 00007f54121b332e sp 00007f53d67fb210 error 4 in libntirpc.so.5.8[7f5412198000+2c000] likely on CPU 3 (core 0, socket 3)
Nov 23 15:57:15 np0005532763 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Nov 23 15:57:15 np0005532763 systemd[1]: Started Process Core Dump (PID 193555/UID 0).
Nov 23 15:57:15 np0005532763 python3.9[193554]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:15.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:16 np0005532763 python3.9[193709]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:57:16 np0005532763 systemd[1]: Reloading.
Nov 23 15:57:16 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:57:16 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:57:16 np0005532763 systemd-coredump[193557]: Process 191179 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 47:#012#0  0x00007f54121b332e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Nov 23 15:57:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:16 np0005532763 podman[193749]: 2025-11-23 20:57:16.898805833 +0000 UTC m=+0.050007356 container died 1728af6d53e57d3f8b3f3f8131144b88067da973514629af244789841842ab0d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Nov 23 15:57:16 np0005532763 systemd[1]: systemd-coredump@8-193555-0.service: Deactivated successfully.
Nov 23 15:57:16 np0005532763 systemd[1]: systemd-coredump@8-193555-0.service: Consumed 1.118s CPU time.
Nov 23 15:57:16 np0005532763 systemd[1]: var-lib-containers-storage-overlay-6aa81a5907dcb568d3c07d8c743292b16e023028bf1f10e0c6c04a0729a49242-merged.mount: Deactivated successfully.
Nov 23 15:57:16 np0005532763 systemd[1]: Starting libvirt logging daemon socket...
Nov 23 15:57:16 np0005532763 podman[193749]: 2025-11-23 20:57:16.96346015 +0000 UTC m=+0.114661573 container remove 1728af6d53e57d3f8b3f3f8131144b88067da973514629af244789841842ab0d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Nov 23 15:57:16 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Main process exited, code=exited, status=139/n/a
Nov 23 15:57:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:16 np0005532763 systemd[1]: Listening on libvirt logging daemon socket.
Nov 23 15:57:16 np0005532763 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 23 15:57:16 np0005532763 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 23 15:57:17 np0005532763 systemd[1]: Starting libvirt logging daemon...
Nov 23 15:57:17 np0005532763 systemd[1]: Started libvirt logging daemon.
Nov 23 15:57:17 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Failed with result 'exit-code'.
Nov 23 15:57:17 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.639s CPU time.
Nov 23 15:57:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:17.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:17.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:18 np0005532763 python3.9[193953]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:57:18 np0005532763 systemd[1]: Reloading.
Nov 23 15:57:18 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:57:18 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:57:18 np0005532763 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 23 15:57:18 np0005532763 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 23 15:57:18 np0005532763 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 23 15:57:18 np0005532763 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 23 15:57:18 np0005532763 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 23 15:57:18 np0005532763 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 23 15:57:18 np0005532763 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 23 15:57:18 np0005532763 systemd[1]: Starting libvirt nodedev daemon...
Nov 23 15:57:18 np0005532763 systemd[1]: Started libvirt nodedev daemon.
Nov 23 15:57:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:18 np0005532763 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 23 15:57:18 np0005532763 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 23 15:57:18 np0005532763 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 23 15:57:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:18 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:57:18 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:57:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:19.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:19 np0005532763 python3.9[194204]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:57:19 np0005532763 systemd[1]: Reloading.
Nov 23 15:57:19 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:57:19 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:57:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:57:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:19.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:19 np0005532763 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 23 15:57:19 np0005532763 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 23 15:57:19 np0005532763 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 23 15:57:19 np0005532763 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 23 15:57:19 np0005532763 systemd[1]: Starting libvirt proxy daemon...
Nov 23 15:57:19 np0005532763 setroubleshoot[194014]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 0f9fc671-ce92-498b-8d42-c21a931ef35c
Nov 23 15:57:19 np0005532763 setroubleshoot[194014]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 23 15:57:19 np0005532763 setroubleshoot[194014]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 0f9fc671-ce92-498b-8d42-c21a931ef35c
Nov 23 15:57:19 np0005532763 systemd[1]: Started libvirt proxy daemon.
Nov 23 15:57:19 np0005532763 setroubleshoot[194014]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 23 15:57:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:20 np0005532763 python3.9[194431]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:57:20 np0005532763 systemd[1]: Reloading.
Nov 23 15:57:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:20 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:57:20 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:57:21 np0005532763 systemd[1]: Listening on libvirt locking daemon socket.
Nov 23 15:57:21 np0005532763 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 23 15:57:21 np0005532763 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 23 15:57:21 np0005532763 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 23 15:57:21 np0005532763 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 23 15:57:21 np0005532763 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 23 15:57:21 np0005532763 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 23 15:57:21 np0005532763 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 23 15:57:21 np0005532763 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 23 15:57:21 np0005532763 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 23 15:57:21 np0005532763 systemd[1]: Starting libvirt QEMU daemon...
Nov 23 15:57:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:21.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:21 np0005532763 systemd[1]: Started libvirt QEMU daemon.
Nov 23 15:57:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:21.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:22 np0005532763 python3.9[194660]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:57:22 np0005532763 systemd[1]: Reloading.
Nov 23 15:57:22 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:57:22 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:57:22 np0005532763 systemd[1]: Starting libvirt secret daemon socket...
Nov 23 15:57:22 np0005532763 systemd[1]: Listening on libvirt secret daemon socket.
Nov 23 15:57:22 np0005532763 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 23 15:57:22 np0005532763 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 23 15:57:22 np0005532763 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 23 15:57:22 np0005532763 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 23 15:57:22 np0005532763 systemd[1]: Starting libvirt secret daemon...
Nov 23 15:57:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:22 np0005532763 systemd[1]: Started libvirt secret daemon.
Nov 23 15:57:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:23.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205723 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:57:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:23.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:23 np0005532763 python3.9[194873]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:24 np0005532763 python3.9[195025]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 23 15:57:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:57:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:25.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:25 np0005532763 python3.9[195178]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:57:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:25.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:26 np0005532763 python3.9[195333]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 23 15:57:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:27 np0005532763 podman[195411]: 2025-11-23 20:57:27.22455639 +0000 UTC m=+0.093175640 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 15:57:27 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Scheduled restart job, restart counter is at 9.
Nov 23 15:57:27 np0005532763 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:57:27 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.639s CPU time.
Nov 23 15:57:27 np0005532763 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:57:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:27.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:27 np0005532763 python3.9[195528]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:57:27 np0005532763 podman[195549]: 2025-11-23 20:57:27.588095385 +0000 UTC m=+0.061844589 container create 62010a342be948dd43c994cb101b9faf1544c01bc7db64d530f8484046610931 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 15:57:27 np0005532763 podman[195549]: 2025-11-23 20:57:27.558950296 +0000 UTC m=+0.032699550 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:57:27 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e0a0f4cb573b9d31a6218b78f7d5606096d7a7aa591225d732c1f79d9e01179/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Nov 23 15:57:27 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e0a0f4cb573b9d31a6218b78f7d5606096d7a7aa591225d732c1f79d9e01179/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:57:27 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e0a0f4cb573b9d31a6218b78f7d5606096d7a7aa591225d732c1f79d9e01179/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:57:27 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e0a0f4cb573b9d31a6218b78f7d5606096d7a7aa591225d732c1f79d9e01179/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.dqbktw-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:57:27 np0005532763 podman[195549]: 2025-11-23 20:57:27.678335911 +0000 UTC m=+0.152085175 container init 62010a342be948dd43c994cb101b9faf1544c01bc7db64d530f8484046610931 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True)
Nov 23 15:57:27 np0005532763 podman[195549]: 2025-11-23 20:57:27.688247969 +0000 UTC m=+0.161997173 container start 62010a342be948dd43c994cb101b9faf1544c01bc7db64d530f8484046610931 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 23 15:57:27 np0005532763 bash[195549]: 62010a342be948dd43c994cb101b9faf1544c01bc7db64d530f8484046610931
Nov 23 15:57:27 np0005532763 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:57:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:27 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Nov 23 15:57:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:27 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Nov 23 15:57:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:27.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:27 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Nov 23 15:57:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:27 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Nov 23 15:57:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:27 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Nov 23 15:57:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:27 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Nov 23 15:57:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:27 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Nov 23 15:57:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:27 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:57:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:28 np0005532763 python3.9[195728]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763931446.9704895-3362-21615294424046/.source.xml follow=False _original_basename=secret.xml.j2 checksum=2095b2efdb764c083af64051baa9ed5d4618fea0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:28 np0005532763 python3.9[195881]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 03808be8-ae4a-5548-82e6-4a294f1bc627#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:57:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:29.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:57:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:29.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:29 np0005532763 python3.9[196044]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:29 np0005532763 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 23 15:57:29 np0005532763 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 23 15:57:30 np0005532763 auditd[711]: Audit daemon rotating log files
Nov 23 15:57:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:31.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:31.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:32 np0005532763 python3.9[196509]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:33.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:33 np0005532763 python3.9[196662]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:57:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:33.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:33 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:57:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:33 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:57:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:34 np0005532763 podman[196758]: 2025-11-23 20:57:34.075561044 +0000 UTC m=+0.143679811 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 23 15:57:34 np0005532763 python3.9[196803]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763931452.900615-3527-186529480832320/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:57:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:35 np0005532763 python3.9[196963]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:35.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:35.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:36 np0005532763 python3.9[197116]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:57:36 np0005532763 python3.9[197194]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:36 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 15:57:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:36 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:57:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:36 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:57:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:36 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:57:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:37.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:37 np0005532763 python3.9[197347]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:57:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:37.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:37 np0005532763 python3.9[197426]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.eson9bby recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:38 np0005532763 python3.9[197578]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:57:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:39 np0005532763 python3.9[197657]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:39.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:39.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:57:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:40 np0005532763 python3.9[197821]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:57:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:41 np0005532763 python3[198000]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 23 15:57:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:41.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:41 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd638000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:41 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:41 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:41.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:42 np0005532763 python3.9[198157]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:57:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:42 np0005532763 python3.9[198235]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:43.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205743 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:57:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:43 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:43 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:43 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:43 np0005532763 python3.9[198388]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:57:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:43.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:44 np0005532763 python3.9[198467]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:57:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:45.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:45 np0005532763 python3.9[198620]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:57:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:45 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:45 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd618001140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205745 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:57:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:45 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6300029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:45.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:46 np0005532763 python3.9[198699]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:46 np0005532763 python3.9[198852]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:57:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:47.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:47 np0005532763 python3.9[198930]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:47 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:47 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:47 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd618001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:47.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:48 np0005532763 python3.9[199083]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:57:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:49.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:49 np0005532763 python3.9[199209]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763931467.7703297-3902-211592727434633/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:49 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6300029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:49 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:57:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:49 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6140010a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:49.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:50 np0005532763 python3.9[199362]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:51 np0005532763 python3.9[199515]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:57:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:51.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:51 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd618001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:51 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6300029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:51 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:51.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:51 np0005532763 python3.9[199671]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:57:52.210 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 15:57:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:57:52.211 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 15:57:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:57:52.211 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 15:57:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:53 np0005532763 python3.9[199824]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:57:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:53.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:53 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd614001bc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:53 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd618001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:53 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:53.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:54 np0005532763 python3.9[199978]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:57:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:57:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:54 np0005532763 python3.9[200133]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:57:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:55.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:55 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:55 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd614001bc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:55 np0005532763 python3.9[200288]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:55 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180030f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:55.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:56 np0005532763 python3.9[200441]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:57:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:57 np0005532763 python3.9[200565]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763931476.0185566-4117-144754327190417/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:57.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:57 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:57 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:57 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd614001bc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:57.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:58 np0005532763 podman[200690]: 2025-11-23 20:57:58.020568968 +0000 UTC m=+0.069441704 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 15:57:58 np0005532763 python3.9[200735]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:57:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:58 np0005532763 python3.9[200860]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763931477.618579-4163-147961089194683/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:57:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:57:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:57:59.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:57:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:59 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180030f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:59 np0005532763 python3.9[201012]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:57:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:59 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:57:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:57:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:57:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:57:59 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:57:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:57:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:57:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:57:59.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:57:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:57:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:00 np0005532763 python3.9[201136]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763931479.0714924-4208-255147528208593/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:58:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:01 np0005532763 python3.9[201314]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:58:01 np0005532763 systemd[1]: Reloading.
Nov 23 15:58:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:01.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:01 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:58:01 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:58:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:01 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd614003050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:01 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180030f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:01 np0005532763 systemd[1]: Reached target edpm_libvirt.target.
Nov 23 15:58:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:01 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:01.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:02 np0005532763 python3.9[201507]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 23 15:58:02 np0005532763 systemd[1]: Reloading.
Nov 23 15:58:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:02 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:58:02 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:58:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:03 np0005532763 systemd[1]: Reloading.
Nov 23 15:58:03 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:58:03 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:58:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:03.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:03 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:03 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd614003050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:03 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:03.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:03 np0005532763 systemd[1]: session-52.scope: Deactivated successfully.
Nov 23 15:58:03 np0005532763 systemd[1]: session-52.scope: Consumed 4min 11.913s CPU time.
Nov 23 15:58:03 np0005532763 systemd-logind[830]: Session 52 logged out. Waiting for processes to exit.
Nov 23 15:58:03 np0005532763 systemd-logind[830]: Removed session 52.
Nov 23 15:58:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:58:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:05 np0005532763 podman[201607]: 2025-11-23 20:58:05.277953237 +0000 UTC m=+0.153127765 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 15:58:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:05.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:05 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:05 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:05 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd614003050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:05.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:07.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:07 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:07 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:07 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:07.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:09.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:09 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd614003050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:58:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:09 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:09 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630003e40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:09.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:09 np0005532763 systemd-logind[830]: New session 53 of user zuul.
Nov 23 15:58:09 np0005532763 systemd[1]: Started Session 53 of User zuul.
Nov 23 15:58:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:10 np0005532763 python3.9[201794]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:58:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:11.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:11 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:11 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd608000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:11 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd60c000e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:11.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:12 np0005532763 python3.9[201950]: ansible-ansible.builtin.service_facts Invoked
Nov 23 15:58:12 np0005532763 network[201967]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 15:58:12 np0005532763 network[201968]: 'network-scripts' will be removed from distribution in near future.
Nov 23 15:58:12 np0005532763 network[201969]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 15:58:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:13.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:13 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd60c000e00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205813 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:58:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:13 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd608000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:13 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630003e40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:13.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:58:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:15.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:15 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd600000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:15 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:15 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.003000083s ======
Nov 23 15:58:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:15.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000083s
Nov 23 15:58:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:17.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:17 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630004b50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:17 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd600001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:17 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:17.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:18 np0005532763 python3.9[202247]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 23 15:58:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:19.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:19 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:19 np0005532763 python3.9[202412]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:58:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:58:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:19 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630004b50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:19 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd600001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:19.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:21.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:21 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:21 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:21 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630004b50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:21.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:23.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:23 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:58:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:23 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd600001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:23 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:23 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:23.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:24 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:58:24 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:58:24 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:58:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:58:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:25.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:25 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630004b50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:25 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630004b50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:25 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:25.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:26 : epoch 69237537 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:58:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:27.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:27 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:27 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd600002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:27 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630004b50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:27.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:27 np0005532763 python3.9[202599]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:58:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:28 np0005532763 podman[202624]: 2025-11-23 20:58:28.211590623 +0000 UTC m=+0.089580087 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 23 15:58:28 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:58:28 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:58:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:28 np0005532763 python3.9[202799]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:58:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:29.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:29 : epoch 69237537 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:58:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:29 : epoch 69237537 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:58:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:29 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630004b50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:58:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:29 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:29 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd600002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:29.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:29 np0005532763 python3.9[202953]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:58:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:30 np0005532763 python3.9[203106]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:58:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:31.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:31 np0005532763 python3.9[203259]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:58:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:31 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c003cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:31 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630004b50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:31 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:31.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:32 np0005532763 python3.9[203383]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763931511.063258-247-103791725619782/.source.iscsi _original_basename=.lqhsf2ub follow=False checksum=378ca78468db870189bbb6a8f27e45376ef30288 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:58:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:32 : epoch 69237537 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:58:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:33 np0005532763 python3.9[203536]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:58:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:33.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:33 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6000039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:33 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c003cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:33 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630004b50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:33.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:34 np0005532763 python3.9[203689]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:58:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:58:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:35.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:35 np0005532763 python3.9[203842]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:58:35 np0005532763 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 23 15:58:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:35 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:35 np0005532763 podman[203844]: 2025-11-23 20:58:35.697151597 +0000 UTC m=+0.143177077 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 23 15:58:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:35 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6000039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:35 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6000039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:35.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:36 np0005532763 python3.9[204026]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:58:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:36 np0005532763 systemd[1]: Reloading.
Nov 23 15:58:36 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:58:36 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:58:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:37 np0005532763 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 23 15:58:37 np0005532763 systemd[1]: Starting Open-iSCSI...
Nov 23 15:58:37 np0005532763 kernel: Loading iSCSI transport class v2.0-870.
Nov 23 15:58:37 np0005532763 systemd[1]: Started Open-iSCSI.
Nov 23 15:58:37 np0005532763 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 23 15:58:37 np0005532763 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 23 15:58:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:37.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:37 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630004b50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205837 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:58:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:37 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:37 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6000039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:37.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:38 np0005532763 python3.9[204228]: ansible-ansible.builtin.service_facts Invoked
Nov 23 15:58:38 np0005532763 network[204245]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 15:58:38 np0005532763 network[204246]: 'network-scripts' will be removed from distribution in near future.
Nov 23 15:58:38 np0005532763 network[204247]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 15:58:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:39.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6000039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:58:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630004b50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:39 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:39.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:41.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:41 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6000039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:41 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd62c003cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:41 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630004b50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:41.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:43.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:43 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:43 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6000039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:43 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6000039c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:43.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:43 np0005532763 python3.9[204550]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 15:58:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:58:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:45 np0005532763 python3.9[204703]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 23 15:58:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.003000084s ======
Nov 23 15:58:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:45.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000084s
Nov 23 15:58:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:45 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630004b50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:45 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:45 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd60c000fa0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:45.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:46 np0005532763 python3.9[204863]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:58:46 np0005532763 python3.9[204986]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763931525.4060745-479-81984200857762/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:58:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:58:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:47.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:58:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:47 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd608000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:47 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630004b50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:47 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:47 np0005532763 python3.9[205140]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:58:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:47.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:49 np0005532763 python3.9[205293]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:58:49 np0005532763 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 23 15:58:49 np0005532763 systemd[1]: Stopped Load Kernel Modules.
Nov 23 15:58:49 np0005532763 systemd[1]: Stopping Load Kernel Modules...
Nov 23 15:58:49 np0005532763 systemd[1]: Starting Load Kernel Modules...
Nov 23 15:58:49 np0005532763 systemd[1]: Finished Load Kernel Modules.
Nov 23 15:58:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:49.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:49 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd60c000fa0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:58:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:49 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd6180041f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:49 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd608001840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:58:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:49.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:49 np0005532763 python3.9[205450]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:58:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:50 np0005532763 python3.9[205603]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:58:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:51.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:51 np0005532763 kernel: ganesha.nfsd[197783]: segfault at 50 ip 00007fd6e2eea32e sp 00007fd6ae7fb210 error 4 in libntirpc.so.5.8[7fd6e2ecf000+2c000] likely on CPU 1 (core 0, socket 1)
Nov 23 15:58:51 np0005532763 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Nov 23 15:58:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[195583]: 23/11/2025 20:58:51 : epoch 69237537 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd630004b50 fd 48 proxy ignored for local
Nov 23 15:58:51 np0005532763 systemd[1]: Started Process Core Dump (PID 205681/UID 0).
Nov 23 15:58:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000057s ======
Nov 23 15:58:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:51.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Nov 23 15:58:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:52 np0005532763 python3.9[205758]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:58:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:58:52.211 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 15:58:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:58:52.211 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 15:58:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:58:52.211 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 15:58:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:52 np0005532763 systemd-coredump[205684]: Process 195598 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 42:#012#0  0x00007fd6e2eea32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Nov 23 15:58:52 np0005532763 systemd[1]: systemd-coredump@9-205681-0.service: Deactivated successfully.
Nov 23 15:58:52 np0005532763 systemd[1]: systemd-coredump@9-205681-0.service: Consumed 1.133s CPU time.
Nov 23 15:58:52 np0005532763 podman[205916]: 2025-11-23 20:58:52.949945704 +0000 UTC m=+0.039144436 container died 62010a342be948dd43c994cb101b9faf1544c01bc7db64d530f8484046610931 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:58:52 np0005532763 python3.9[205911]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:58:52 np0005532763 systemd[1]: var-lib-containers-storage-overlay-9e0a0f4cb573b9d31a6218b78f7d5606096d7a7aa591225d732c1f79d9e01179-merged.mount: Deactivated successfully.
Nov 23 15:58:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:53 np0005532763 podman[205916]: 2025-11-23 20:58:53.030064966 +0000 UTC m=+0.119263678 container remove 62010a342be948dd43c994cb101b9faf1544c01bc7db64d530f8484046610931 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Nov 23 15:58:53 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Main process exited, code=exited, status=139/n/a
Nov 23 15:58:53 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Failed with result 'exit-code'.
Nov 23 15:58:53 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.808s CPU time.
Nov 23 15:58:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:53.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:53 np0005532763 python3.9[206081]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763931532.3682687-653-41578439931089/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:58:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:53.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:54 np0005532763 python3.9[206234]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:58:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:58:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:55 np0005532763 python3.9[206388]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:58:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:55.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:55.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:56 np0005532763 python3.9[206541]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:58:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:57 np0005532763 python3.9[206694]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:58:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:58:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:57.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:58:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205857 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:58:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:57.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:58 np0005532763 python3.9[206847]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:58:58 np0005532763 podman[206972]: 2025-11-23 20:58:58.77541233 +0000 UTC m=+0.075339169 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 15:58:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:58 np0005532763 python3.9[207019]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:58:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:58:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:58:59.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:58:59 np0005532763 python3.9[207172]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:58:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:58:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:58:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:58:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:58:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:58:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:58:59.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:58:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:58:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:00 np0005532763 python3.9[207325]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:59:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:00 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:01 np0005532763 python3.9[207503]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:59:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:01.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:01.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:01 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:02 np0005532763 python3.9[207658]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:59:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:02 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:03 np0005532763 python3.9[207811]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:59:03 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Scheduled restart job, restart counter is at 10.
Nov 23 15:59:03 np0005532763 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:59:03 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.808s CPU time.
Nov 23 15:59:03 np0005532763 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 15:59:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:59:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:03.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:59:03 np0005532763 podman[207958]: 2025-11-23 20:59:03.715596514 +0000 UTC m=+0.068180339 container create 490d1748570c07ef29ecfa055dcc4ea82939c25d2b53254145a6bfca7e93b892 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 23 15:59:03 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/589e58fa50aee587a574f23e3af4bf8c71b02a905483be707feca0a45202b065/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Nov 23 15:59:03 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/589e58fa50aee587a574f23e3af4bf8c71b02a905483be707feca0a45202b065/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 15:59:03 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/589e58fa50aee587a574f23e3af4bf8c71b02a905483be707feca0a45202b065/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:59:03 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/589e58fa50aee587a574f23e3af4bf8c71b02a905483be707feca0a45202b065/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.dqbktw-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 15:59:03 np0005532763 podman[207958]: 2025-11-23 20:59:03.689573546 +0000 UTC m=+0.042157441 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 15:59:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:03 np0005532763 podman[207958]: 2025-11-23 20:59:03.793466903 +0000 UTC m=+0.146050748 container init 490d1748570c07ef29ecfa055dcc4ea82939c25d2b53254145a6bfca7e93b892 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=squid)
Nov 23 15:59:03 np0005532763 podman[207958]: 2025-11-23 20:59:03.809467831 +0000 UTC m=+0.162051636 container start 490d1748570c07ef29ecfa055dcc4ea82939c25d2b53254145a6bfca7e93b892 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 15:59:03 np0005532763 bash[207958]: 490d1748570c07ef29ecfa055dcc4ea82939c25d2b53254145a6bfca7e93b892
Nov 23 15:59:03 np0005532763 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 15:59:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:03 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Nov 23 15:59:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:03 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Nov 23 15:59:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:03.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:03 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Nov 23 15:59:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:03 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Nov 23 15:59:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:03 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Nov 23 15:59:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:03 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Nov 23 15:59:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:03 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Nov 23 15:59:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:03 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:59:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:03 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:04 np0005532763 python3.9[208030]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:59:04 np0005532763 python3.9[208145]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:59:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:59:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:04 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:05 np0005532763 python3.9[208298]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:59:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:05.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:05 np0005532763 podman[208377]: 2025-11-23 20:59:05.865873829 +0000 UTC m=+0.102506699 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Nov 23 15:59:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:05.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:05 np0005532763 python3.9[208378]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:59:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:05 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:06 np0005532763 python3.9[208555]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:59:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:06 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:59:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:07.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:59:07 np0005532763 python3.9[208708]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:59:07.747244) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931547747316, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1302, "num_deletes": 255, "total_data_size": 3217770, "memory_usage": 3273584, "flush_reason": "Manual Compaction"}
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931547762482, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 2105783, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18587, "largest_seqno": 19884, "table_properties": {"data_size": 2100198, "index_size": 2977, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11231, "raw_average_key_size": 18, "raw_value_size": 2089134, "raw_average_value_size": 3470, "num_data_blocks": 133, "num_entries": 602, "num_filter_entries": 602, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763931433, "oldest_key_time": 1763931433, "file_creation_time": 1763931547, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 15308 microseconds, and 9174 cpu microseconds.
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:59:07.762547) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 2105783 bytes OK
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:59:07.762574) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:59:07.764139) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:59:07.764173) EVENT_LOG_v1 {"time_micros": 1763931547764163, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:59:07.764197) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 3211677, prev total WAL file size 3211677, number of live WAL files 2.
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:59:07.766031) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323532' seq:0, type:0; will stop at (end)
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(2056KB)], [33(11MB)]
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931547766093, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 14540903, "oldest_snapshot_seqno": -1}
Nov 23 15:59:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4994 keys, 14063808 bytes, temperature: kUnknown
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931547830987, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 14063808, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14028759, "index_size": 21435, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12549, "raw_key_size": 126833, "raw_average_key_size": 25, "raw_value_size": 13936595, "raw_average_value_size": 2790, "num_data_blocks": 881, "num_entries": 4994, "num_filter_entries": 4994, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763931547, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:59:07.831255) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 14063808 bytes
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:59:07.832689) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 223.8 rd, 216.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 11.9 +0.0 blob) out(13.4 +0.0 blob), read-write-amplify(13.6) write-amplify(6.7) OK, records in: 5518, records dropped: 524 output_compression: NoCompression
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:59:07.832709) EVENT_LOG_v1 {"time_micros": 1763931547832700, "job": 18, "event": "compaction_finished", "compaction_time_micros": 64980, "compaction_time_cpu_micros": 35497, "output_level": 6, "num_output_files": 1, "total_output_size": 14063808, "num_input_records": 5518, "num_output_records": 4994, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931547833262, "job": 18, "event": "table_file_deletion", "file_number": 35}
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931547835985, "job": 18, "event": "table_file_deletion", "file_number": 33}
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:59:07.765919) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:59:07.836224) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:59:07.836233) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:59:07.836237) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:59:07.836240) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:59:07 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-20:59:07.836243) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 15:59:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:07.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:07 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:08 np0005532763 python3.9[208787]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:59:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:08 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:09 np0005532763 python3.9[208940]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:59:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:59:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:09.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:59:09 np0005532763 python3.9[209019]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:59:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:59:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:09.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:09 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:59:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:09 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:59:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:09 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:10 np0005532763 python3.9[209173]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:59:10 np0005532763 systemd[1]: Reloading.
Nov 23 15:59:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:10 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:11 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:59:11 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:59:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:59:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:11.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:59:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:11.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:11 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:12 np0005532763 python3.9[209363]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:59:12 np0005532763 python3.9[209441]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:59:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:12 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:59:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:13.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:59:13 np0005532763 python3.9[209594]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:59:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:59:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:13.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:59:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:13 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:14 np0005532763 python3.9[209673]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:59:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:59:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:14 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:15 np0005532763 python3.9[209826]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:59:15 np0005532763 systemd[1]: Reloading.
Nov 23 15:59:15 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:59:15 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:59:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:59:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:15.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:59:15 np0005532763 systemd[1]: Starting Create netns directory...
Nov 23 15:59:15 np0005532763 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 23 15:59:15 np0005532763 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 23 15:59:15 np0005532763 systemd[1]: Finished Create netns directory.
Nov 23 15:59:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000056s ======
Nov 23 15:59:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:15.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Nov 23 15:59:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:15 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:59:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:15 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Nov 23 15:59:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:15 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Nov 23 15:59:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:15 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Nov 23 15:59:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:15 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Nov 23 15:59:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:15 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Nov 23 15:59:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:15 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Nov 23 15:59:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:15 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:59:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:15 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:59:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:15 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:59:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:15 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:16 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:16 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:16 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:16 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:16 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:16 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:16 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:16 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:16 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:16 : epoch 69237597 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:16 : epoch 69237597 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:16 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:16 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:16 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:16 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:16 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:16 : epoch 69237597 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 15:59:16 np0005532763 python3.9[210032]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:16 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:17.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:17 np0005532763 python3.9[210185]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:59:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:17 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca60000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:17 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca54001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:17 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca3c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:17.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:17 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:18 np0005532763 python3.9[210311]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763931556.9374304-1274-229677310188829/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:59:18 np0005532763 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 23 15:59:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:18 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:19.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:19 np0005532763 python3.9[210465]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 15:59:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205919 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:59:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:19 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca50000f90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:59:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:19 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:19 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca54001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:59:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:19.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:59:19 np0005532763 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 23 15:59:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:19 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:20 np0005532763 python3.9[210621]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:59:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:20 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:21 np0005532763 python3.9[210745]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763931559.8038428-1348-232086197388734/.source.json _original_basename=.k27088ba follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:59:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:21.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:21 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca50001ab0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:21 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:21 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:21 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca38000e00 fd 50 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:21 np0005532763 python3.9[210923]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:59:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:59:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:21.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:59:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:22 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:23.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:23 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca3c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:23 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:23 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:23 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:59:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:23.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:59:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:59:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:24 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:24 np0005532763 python3.9[211353]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 23 15:59:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:59:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:25.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:59:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:25 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca500023d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:25 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:25 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca3c001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:25 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30001ce0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:59:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:25.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:59:25 np0005532763 python3.9[211506]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 15:59:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:26 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:27 np0005532763 python3.9[211659]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 23 15:59:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:27.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:27 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca500023d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:27 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:27 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca38000e00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:27 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30001ce0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:27.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:28 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:28 np0005532763 podman[211905]: 2025-11-23 20:59:28.969207933 +0000 UTC m=+0.067270923 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 15:59:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:29 np0005532763 python3[211908]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 15:59:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:29.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:29 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca38000e00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:59:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:29 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:29 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30001ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:29 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c002950 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:29.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:30 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 15:59:30 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:59:30 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:59:30 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 15:59:30 np0005532763 podman[211954]: 2025-11-23 20:59:30.45587752 +0000 UTC m=+1.150452850 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24
Nov 23 15:59:30 np0005532763 podman[212012]: 2025-11-23 20:59:30.611635518 +0000 UTC m=+0.050768222 container create 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd)
Nov 23 15:59:30 np0005532763 podman[212012]: 2025-11-23 20:59:30.588323856 +0000 UTC m=+0.027456570 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24
Nov 23 15:59:30 np0005532763 python3[211908]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24
Nov 23 15:59:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:30 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:31.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:31 np0005532763 python3.9[212201]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:59:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:31 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca50003330 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:31 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:31 np0005532763 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 23 15:59:31 np0005532763 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 23 15:59:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205931 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:59:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:31 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca38002240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:31 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca38002240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:59:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:31.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:59:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:32 np0005532763 python3.9[212358]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:59:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:32 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:33 np0005532763 python3.9[212435]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:59:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:33.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:33 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c003270 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:33 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:33 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca50003330 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:33 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca38002240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:33.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:34 np0005532763 python3.9[212587]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763931573.2719765-1612-276748015805639/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:59:34 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:59:34 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 15:59:34 np0005532763 python3.9[212688]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 15:59:34 np0005532763 systemd[1]: Reloading.
Nov 23 15:59:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:59:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:34 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:34 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:59:34 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:59:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:35.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:35 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca50003330 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:35 np0005532763 python3.9[212799]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 15:59:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:35 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:35 np0005532763 systemd[1]: Reloading.
Nov 23 15:59:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:35 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:35 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:35.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:35 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:59:35 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:59:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:36 np0005532763 systemd[1]: Starting multipathd container...
Nov 23 15:59:36 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:59:36 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32e8ea086fa2743b8e214863a14f418f756e590cd78d0d8cf410b3c3a3ff6c7a/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 23 15:59:36 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32e8ea086fa2743b8e214863a14f418f756e590cd78d0d8cf410b3c3a3ff6c7a/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 23 15:59:36 np0005532763 systemd[1]: Started /usr/bin/podman healthcheck run 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152.
Nov 23 15:59:36 np0005532763 podman[212842]: 2025-11-23 20:59:36.380053156 +0000 UTC m=+0.140527953 container init 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 15:59:36 np0005532763 multipathd[212876]: + sudo -E kolla_set_configs
Nov 23 15:59:36 np0005532763 podman[212840]: 2025-11-23 20:59:36.393241695 +0000 UTC m=+0.152904269 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 23 15:59:36 np0005532763 podman[212842]: 2025-11-23 20:59:36.414246823 +0000 UTC m=+0.174721610 container start 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 15:59:36 np0005532763 podman[212842]: multipathd
Nov 23 15:59:36 np0005532763 systemd[1]: Started multipathd container.
Nov 23 15:59:36 np0005532763 multipathd[212876]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 15:59:36 np0005532763 multipathd[212876]: INFO:__main__:Validating config file
Nov 23 15:59:36 np0005532763 multipathd[212876]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 15:59:36 np0005532763 multipathd[212876]: INFO:__main__:Writing out command to execute
Nov 23 15:59:36 np0005532763 podman[212890]: 2025-11-23 20:59:36.509400125 +0000 UTC m=+0.080593006 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 23 15:59:36 np0005532763 systemd[1]: 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152-1089cd93a06912c9.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 15:59:36 np0005532763 systemd[1]: 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152-1089cd93a06912c9.service: Failed with result 'exit-code'.
Nov 23 15:59:36 np0005532763 multipathd[212876]: ++ cat /run_command
Nov 23 15:59:36 np0005532763 multipathd[212876]: + CMD='/usr/sbin/multipathd -d'
Nov 23 15:59:36 np0005532763 multipathd[212876]: + ARGS=
Nov 23 15:59:36 np0005532763 multipathd[212876]: + sudo kolla_copy_cacerts
Nov 23 15:59:36 np0005532763 multipathd[212876]: + [[ ! -n '' ]]
Nov 23 15:59:36 np0005532763 multipathd[212876]: + . kolla_extend_start
Nov 23 15:59:36 np0005532763 multipathd[212876]: Running command: '/usr/sbin/multipathd -d'
Nov 23 15:59:36 np0005532763 multipathd[212876]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 23 15:59:36 np0005532763 multipathd[212876]: + umask 0022
Nov 23 15:59:36 np0005532763 multipathd[212876]: + exec /usr/sbin/multipathd -d
Nov 23 15:59:36 np0005532763 multipathd[212876]: 3526.152036 | --------start up--------
Nov 23 15:59:36 np0005532763 multipathd[212876]: 3526.152053 | read /etc/multipath.conf
Nov 23 15:59:36 np0005532763 multipathd[212876]: 3526.159645 | path checkers start up
Nov 23 15:59:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:36 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:37 np0005532763 python3.9[213073]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 15:59:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:59:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:37.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:59:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:37 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca38002240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:37 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:37 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:37 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:37.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:38 np0005532763 python3.9[213228]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 15:59:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:38 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:39 np0005532763 python3.9[213394]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:59:39 np0005532763 systemd[1]: Stopping multipathd container...
Nov 23 15:59:39 np0005532763 multipathd[212876]: 3528.995456 | exit (signal)
Nov 23 15:59:39 np0005532763 multipathd[212876]: 3528.996211 | --------shut down-------
Nov 23 15:59:39 np0005532763 systemd[1]: libpod-43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152.scope: Deactivated successfully.
Nov 23 15:59:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:39.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:39 np0005532763 podman[213398]: 2025-11-23 20:59:39.447821711 +0000 UTC m=+0.104349221 container died 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 15:59:39 np0005532763 systemd[1]: 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152-1089cd93a06912c9.timer: Deactivated successfully.
Nov 23 15:59:39 np0005532763 systemd[1]: Stopped /usr/bin/podman healthcheck run 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152.
Nov 23 15:59:39 np0005532763 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152-userdata-shm.mount: Deactivated successfully.
Nov 23 15:59:39 np0005532763 systemd[1]: var-lib-containers-storage-overlay-32e8ea086fa2743b8e214863a14f418f756e590cd78d0d8cf410b3c3a3ff6c7a-merged.mount: Deactivated successfully.
Nov 23 15:59:39 np0005532763 podman[213398]: 2025-11-23 20:59:39.650046259 +0000 UTC m=+0.306573769 container cleanup 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 23 15:59:39 np0005532763 podman[213398]: multipathd
Nov 23 15:59:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:39 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca50004430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:39 np0005532763 podman[213427]: multipathd
Nov 23 15:59:39 np0005532763 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 23 15:59:39 np0005532763 systemd[1]: Stopped multipathd container.
Nov 23 15:59:39 np0005532763 systemd[1]: Starting multipathd container...
Nov 23 15:59:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:59:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:39 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:39 np0005532763 systemd[1]: Started libcrun container.
Nov 23 15:59:39 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32e8ea086fa2743b8e214863a14f418f756e590cd78d0d8cf410b3c3a3ff6c7a/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 23 15:59:39 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32e8ea086fa2743b8e214863a14f418f756e590cd78d0d8cf410b3c3a3ff6c7a/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 23 15:59:39 np0005532763 systemd[1]: Started /usr/bin/podman healthcheck run 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152.
Nov 23 15:59:39 np0005532763 podman[213439]: 2025-11-23 20:59:39.862502554 +0000 UTC m=+0.122543660 container init 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 15:59:39 np0005532763 multipathd[213454]: + sudo -E kolla_set_configs
Nov 23 15:59:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:39 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca38002240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:39 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:39 np0005532763 podman[213439]: 2025-11-23 20:59:39.894303164 +0000 UTC m=+0.154344280 container start 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 15:59:39 np0005532763 podman[213439]: multipathd
Nov 23 15:59:39 np0005532763 systemd[1]: Started multipathd container.
Nov 23 15:59:39 np0005532763 multipathd[213454]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 15:59:39 np0005532763 multipathd[213454]: INFO:__main__:Validating config file
Nov 23 15:59:39 np0005532763 multipathd[213454]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 15:59:39 np0005532763 multipathd[213454]: INFO:__main__:Writing out command to execute
Nov 23 15:59:39 np0005532763 multipathd[213454]: ++ cat /run_command
Nov 23 15:59:39 np0005532763 multipathd[213454]: + CMD='/usr/sbin/multipathd -d'
Nov 23 15:59:39 np0005532763 multipathd[213454]: + ARGS=
Nov 23 15:59:39 np0005532763 multipathd[213454]: + sudo kolla_copy_cacerts
Nov 23 15:59:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:39.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:39 np0005532763 multipathd[213454]: + [[ ! -n '' ]]
Nov 23 15:59:39 np0005532763 multipathd[213454]: + . kolla_extend_start
Nov 23 15:59:39 np0005532763 multipathd[213454]: Running command: '/usr/sbin/multipathd -d'
Nov 23 15:59:39 np0005532763 multipathd[213454]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 23 15:59:39 np0005532763 multipathd[213454]: + umask 0022
Nov 23 15:59:39 np0005532763 multipathd[213454]: + exec /usr/sbin/multipathd -d
Nov 23 15:59:40 np0005532763 multipathd[213454]: 3529.595467 | --------start up--------
Nov 23 15:59:40 np0005532763 multipathd[213454]: 3529.595496 | read /etc/multipath.conf
Nov 23 15:59:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:40 np0005532763 multipathd[213454]: 3529.604283 | path checkers start up
Nov 23 15:59:40 np0005532763 podman[213461]: 2025-11-23 20:59:40.016644307 +0000 UTC m=+0.109993989 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 15:59:40 np0005532763 systemd[1]: 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152-7db44d8003710047.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 15:59:40 np0005532763 systemd[1]: 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152-7db44d8003710047.service: Failed with result 'exit-code'.
Nov 23 15:59:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:40 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:40 np0005532763 python3.9[213646]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:59:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:41 : epoch 69237597 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 15:59:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 15:59:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:41.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 15:59:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:41 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:41 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:41 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:41 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca50004430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:59:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:41.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:59:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:42 np0005532763 python3.9[213824]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 23 15:59:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:42 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:42 np0005532763 python3.9[213977]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 23 15:59:42 np0005532763 kernel: Key type psk registered
Nov 23 15:59:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:59:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:43.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:59:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:43 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca38003980 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:43 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:43 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:43 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:43 np0005532763 python3.9[214141]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 15:59:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:43.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:44 : epoch 69237597 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 15:59:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:44 : epoch 69237597 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 15:59:44 np0005532763 python3.9[214264]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763931583.2504888-1853-52949268451768/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:59:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:59:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:44 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:45.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:45 np0005532763 python3.9[214417]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:59:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:45 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca50004430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:45 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:45 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca38003980 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:45 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:45.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:46 np0005532763 python3.9[214570]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:59:46 np0005532763 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 23 15:59:46 np0005532763 systemd[1]: Stopped Load Kernel Modules.
Nov 23 15:59:46 np0005532763 systemd[1]: Stopping Load Kernel Modules...
Nov 23 15:59:46 np0005532763 systemd[1]: Starting Load Kernel Modules...
Nov 23 15:59:46 np0005532763 systemd[1]: Finished Load Kernel Modules.
Nov 23 15:59:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:46 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:47 : epoch 69237597 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 15:59:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:59:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:47.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:59:47 np0005532763 python3.9[214727]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 23 15:59:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:47 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:47 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:47 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca50004430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:47 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca38003980 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:59:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:47.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:59:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:48 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:49.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:49 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:49 np0005532763 systemd[1]: Reloading.
Nov 23 15:59:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:59:49 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:59:49 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:59:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:49 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:49 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:49 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca50004430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:49.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:50 np0005532763 systemd[1]: Reloading.
Nov 23 15:59:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:50 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:59:50 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:59:50 np0005532763 systemd-logind[830]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 23 15:59:50 np0005532763 systemd-logind[830]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 23 15:59:50 np0005532763 lvm[214845]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 15:59:50 np0005532763 lvm[214845]: VG ceph_vg0 finished
Nov 23 15:59:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:50 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:50 np0005532763 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 23 15:59:50 np0005532763 systemd[1]: Starting man-db-cache-update.service...
Nov 23 15:59:50 np0005532763 systemd[1]: Reloading.
Nov 23 15:59:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:51 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:59:51 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:59:51 np0005532763 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 23 15:59:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 15:59:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:51.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 15:59:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:51 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca38003980 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:51 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205951 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 15:59:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:51 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:51 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca54001080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:51.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:59:52.212 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 15:59:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:59:52.213 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 15:59:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 20:59:52.213 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 15:59:52 np0005532763 python3.9[216036]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 15:59:52 np0005532763 systemd[1]: Stopping Open-iSCSI...
Nov 23 15:59:52 np0005532763 iscsid[204068]: iscsid shutting down.
Nov 23 15:59:52 np0005532763 systemd[1]: iscsid.service: Deactivated successfully.
Nov 23 15:59:52 np0005532763 systemd[1]: Stopped Open-iSCSI.
Nov 23 15:59:52 np0005532763 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 23 15:59:52 np0005532763 systemd[1]: Starting Open-iSCSI...
Nov 23 15:59:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:52 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:52 np0005532763 systemd[1]: Started Open-iSCSI.
Nov 23 15:59:52 np0005532763 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 23 15:59:52 np0005532763 systemd[1]: Finished man-db-cache-update.service.
Nov 23 15:59:52 np0005532763 systemd[1]: man-db-cache-update.service: Consumed 2.373s CPU time.
Nov 23 15:59:52 np0005532763 systemd[1]: run-re77b5f75944a4faeaf5d27ecb6fa2209.service: Deactivated successfully.
Nov 23 15:59:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:59:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:53.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:59:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:53 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:53 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:53 np0005532763 python3.9[216344]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 23 15:59:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/205953 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 15:59:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:53 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:53 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 15:59:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:53.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 15:59:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:59:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:54 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:55 np0005532763 python3.9[216502]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 15:59:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:55.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:55 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca54001080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:55 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:55 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:55 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:55.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:56 np0005532763 python3.9[216655]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 15:59:56 np0005532763 systemd[1]: Reloading.
Nov 23 15:59:56 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 15:59:56 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 15:59:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:56 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:57.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:57 np0005532763 python3.9[216841]: ansible-ansible.builtin.service_facts Invoked
Nov 23 15:59:57 np0005532763 network[216859]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 23 15:59:57 np0005532763 network[216860]: 'network-scripts' will be removed from distribution in near future.
Nov 23 15:59:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:57 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:57 np0005532763 network[216861]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 23 15:59:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:57 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:57 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca54002260 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:57 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:57.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:58 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 20:59:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:20:59:59.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 15:59:59 np0005532763 podman[216868]: 2025-11-23 20:59:59.670171526 +0000 UTC m=+0.076509963 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 23 15:59:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:59 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 15:59:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 20:59:59 2025: (VI_0) received an invalid passwd!
Nov 23 15:59:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:59 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 20:59:59 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 15:59:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 15:59:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 15:59:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:20:59:59.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:00 np0005532763 ceph-mon[75752]: overall HEALTH_OK
Nov 23 16:00:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:00 : epoch 69237597 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:00:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:00:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:01.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:00:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:01 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca3c000ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:01 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca38003980 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:01 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca54002b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:01.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:03.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:03 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca5c003b90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:03 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca3c000ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:03 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca38003980 fd 49 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:03 : epoch 69237597 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:00:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:03 : epoch 69237597 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:00:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:03.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:04 np0005532763 python3.9[217187]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 16:00:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:00:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:05 np0005532763 python3.9[217344]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 16:00:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:05.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:05 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca3c000ea0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:05 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca2c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:05 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca24000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:05.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:06 np0005532763 python3.9[217498]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 16:00:06 np0005532763 podman[217623]: 2025-11-23 21:00:06.761116917 +0000 UTC m=+0.139415725 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 23 16:00:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:06 : epoch 69237597 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 16:00:07 np0005532763 python3.9[217669]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 16:00:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:07.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:07 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30003e60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:07 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca3c002750 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:07 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca2c0016a0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:07 np0005532763 python3.9[217829]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 16:00:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:07.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:08 np0005532763 python3.9[217983]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 16:00:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:09.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:09 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30003e60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:00:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:09 np0005532763 python3.9[218137]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 16:00:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:09 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca240016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:09 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca2c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:09.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:10 np0005532763 podman[218187]: 2025-11-23 21:00:10.188350318 +0000 UTC m=+0.066176205 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Nov 23 16:00:10 np0005532763 python3.9[218311]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 16:00:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:11.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:11 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca54002b80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:11 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:11 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca240016a0 fd 50 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:11 np0005532763 python3.9[218466]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:00:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:11.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:12 np0005532763 python3.9[218618]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:00:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:00:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:13.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:00:13 np0005532763 python3.9[218771]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:00:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:13 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca2c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210013 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 16:00:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:13 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca3c002750 fd 49 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:13 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca240016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:13.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:14 np0005532763 python3.9[218924]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:00:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:00:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:15 np0005532763 python3.9[219077]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:00:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:15.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:15 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:15 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca240016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:15 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca2c0016a0 fd 50 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:15.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:16 np0005532763 python3.9[219230]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:00:16 np0005532763 python3.9[219382]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:00:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:17 np0005532763 python3.9[219535]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:00:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:00:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:17.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:00:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:17 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca54003ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:17 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca3c002750 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:17 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca2c0016a0 fd 50 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:17.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:18 np0005532763 python3.9[219688]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:00:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:19 np0005532763 python3.9[219841]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:00:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:19.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:19 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca3c002750 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:00:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:19 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:19 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca24002f00 fd 50 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:19.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:20 np0005532763 python3.9[219994]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:00:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:20 np0005532763 python3.9[220147]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:00:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:21.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:21 np0005532763 python3.9[220299]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:00:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:21 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca2c003470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:21 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca54003ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:21 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:22.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:22 np0005532763 python3.9[220477]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:00:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:23 np0005532763 python3.9[220630]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:00:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:23.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:23 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca24002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:23 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca2c003470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:23 np0005532763 python3.9[220783]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:00:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:23 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca54003ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:24.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:00:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:24 np0005532763 python3.9[220936]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 16:00:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:25.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:25 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca24003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:25 np0005532763 python3.9[221089]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 23 16:00:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:25 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:25 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca2c003d90 fd 49 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:26.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:26 np0005532763 python3.9[221242]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 16:00:26 np0005532763 systemd[1]: Reloading.
Nov 23 16:00:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:27 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 16:00:27 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 16:00:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:27.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:27 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca24003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:27 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca3c003c40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:27 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:28.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:28 np0005532763 python3.9[221430]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 16:00:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:28 np0005532763 python3.9[221584]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 16:00:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:29.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:29 np0005532763 python3.9[221737]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 16:00:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:29 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:00:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:29 np0005532763 podman[221740]: 2025-11-23 21:00:29.815396317 +0000 UTC m=+0.092613221 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 23 16:00:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:29 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:29 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca24003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:30.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:00:30.194149) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931630194207, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1001, "num_deletes": 251, "total_data_size": 2266698, "memory_usage": 2313904, "flush_reason": "Manual Compaction"}
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931630206861, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 1496311, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19889, "largest_seqno": 20885, "table_properties": {"data_size": 1491820, "index_size": 2143, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 9799, "raw_average_key_size": 19, "raw_value_size": 1482875, "raw_average_value_size": 2953, "num_data_blocks": 96, "num_entries": 502, "num_filter_entries": 502, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763931548, "oldest_key_time": 1763931548, "file_creation_time": 1763931630, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 12807 microseconds, and 8234 cpu microseconds.
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:00:30.206936) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 1496311 bytes OK
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:00:30.206981) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:00:30.208573) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:00:30.208595) EVENT_LOG_v1 {"time_micros": 1763931630208588, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:00:30.208616) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 2261804, prev total WAL file size 2261804, number of live WAL files 2.
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:00:30.209824) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(1461KB)], [36(13MB)]
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931630209877, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 15560119, "oldest_snapshot_seqno": -1}
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 4980 keys, 13378632 bytes, temperature: kUnknown
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931630280551, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 13378632, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13344269, "index_size": 20813, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12485, "raw_key_size": 127120, "raw_average_key_size": 25, "raw_value_size": 13252758, "raw_average_value_size": 2661, "num_data_blocks": 854, "num_entries": 4980, "num_filter_entries": 4980, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763931630, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:00:30.280759) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 13378632 bytes
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:00:30.282246) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 220.0 rd, 189.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 13.4 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(19.3) write-amplify(8.9) OK, records in: 5496, records dropped: 516 output_compression: NoCompression
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:00:30.282288) EVENT_LOG_v1 {"time_micros": 1763931630282257, "job": 20, "event": "compaction_finished", "compaction_time_micros": 70729, "compaction_time_cpu_micros": 46860, "output_level": 6, "num_output_files": 1, "total_output_size": 13378632, "num_input_records": 5496, "num_output_records": 4980, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931630282827, "job": 20, "event": "table_file_deletion", "file_number": 38}
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931630285544, "job": 20, "event": "table_file_deletion", "file_number": 36}
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:00:30.209686) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:00:30.285633) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:00:30.285639) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:00:30.285643) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:00:30.285646) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:00:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:00:30.285648) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:00:30 np0005532763 python3.9[221911]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 16:00:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:31 np0005532763 python3.9[222065]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 16:00:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:31.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:31 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca3c003c40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:31 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:31 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:31 np0005532763 python3.9[222219]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 16:00:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:32.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:32 np0005532763 python3.9[222372]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 16:00:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:33.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:33 np0005532763 python3.9[222526]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 23 16:00:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:33 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca24003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:33 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca3c003c40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:33 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:34.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:00:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:35.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:35 np0005532763 python3.9[222762]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:00:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:35 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca2c003d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:35 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:35 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca20000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:35 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:00:35 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:00:35 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:00:35 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:00:35 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:00:35 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:00:35 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:00:35 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:00:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:36.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:36 np0005532763 python3.9[222916]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:00:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:37 np0005532763 podman[223041]: 2025-11-23 21:00:37.031470492 +0000 UTC m=+0.183274757 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=ovn_controller)
Nov 23 16:00:37 np0005532763 python3.9[223089]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:00:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:37.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:37 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca24003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:37 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca2c003d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:37 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30004000 fd 50 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:38.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:38 np0005532763 python3.9[223249]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:00:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:38 np0005532763 python3.9[223403]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:00:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:39.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:39 np0005532763 python3.9[223555]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:00:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:39 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca200016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:00:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:39 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca50001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:39 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30004000 fd 49 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:40.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:40 np0005532763 python3.9[223708]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:00:40 np0005532763 podman[223810]: 2025-11-23 21:00:40.779645734 +0000 UTC m=+0.104157853 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 16:00:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:41 np0005532763 python3.9[223905]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:00:41 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:00:41 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:00:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:41.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:41 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca2c003d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:41 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca24003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:41 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30004000 fd 49 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:41 np0005532763 python3.9[224083]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:00:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:42.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:42 np0005532763 python3.9[224235]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:00:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:43.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:43 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca50001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:43 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca200016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:43 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca24003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:44.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:00:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:45.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:45 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca30004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:45 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca50001fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:45 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca200016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:00:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:46.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:47.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[207997]: 23/11/2025 21:00:47 : epoch 69237597 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fca24003c10 fd 39 proxy ignored for local
Nov 23 16:00:47 np0005532763 kernel: ganesha.nfsd[217188]: segfault at 50 ip 00007fcb0f0cc32e sp 00007fcac77fd210 error 4 in libntirpc.so.5.8[7fcb0f0b1000+2c000] likely on CPU 3 (core 0, socket 3)
Nov 23 16:00:47 np0005532763 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Nov 23 16:00:47 np0005532763 systemd[1]: Started Process Core Dump (PID 224295/UID 0).
Nov 23 16:00:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:48.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:48 np0005532763 python3.9[224395]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 23 16:00:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:48 np0005532763 systemd-coredump[224310]: Process 208021 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 58:#012#0  0x00007fcb0f0cc32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Nov 23 16:00:49 np0005532763 systemd[1]: systemd-coredump@10-224295-0.service: Deactivated successfully.
Nov 23 16:00:49 np0005532763 systemd[1]: systemd-coredump@10-224295-0.service: Consumed 1.210s CPU time.
Nov 23 16:00:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:49 np0005532763 podman[224514]: 2025-11-23 21:00:49.099480029 +0000 UTC m=+0.046396833 container died 490d1748570c07ef29ecfa055dcc4ea82939c25d2b53254145a6bfca7e93b892 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Nov 23 16:00:49 np0005532763 systemd[1]: var-lib-containers-storage-overlay-589e58fa50aee587a574f23e3af4bf8c71b02a905483be707feca0a45202b065-merged.mount: Deactivated successfully.
Nov 23 16:00:49 np0005532763 podman[224514]: 2025-11-23 21:00:49.166213689 +0000 UTC m=+0.113130473 container remove 490d1748570c07ef29ecfa055dcc4ea82939c25d2b53254145a6bfca7e93b892 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:00:49 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Main process exited, code=exited, status=139/n/a
Nov 23 16:00:49 np0005532763 python3.9[224568]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 23 16:00:49 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Failed with result 'exit-code'.
Nov 23 16:00:49 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.982s CPU time.
Nov 23 16:00:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:49.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:00:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:50.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:50 np0005532763 ceph-osd[78269]: bluestore.MempoolThread fragmentation_score=0.000025 took=0.000041s
Nov 23 16:00:50 np0005532763 python3.9[224755]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 23 16:00:50 np0005532763 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 16:00:50 np0005532763 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 16:00:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:51.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:51 np0005532763 systemd-logind[830]: New session 54 of user zuul.
Nov 23 16:00:51 np0005532763 systemd[1]: Started Session 54 of User zuul.
Nov 23 16:00:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:51 np0005532763 systemd[1]: session-54.scope: Deactivated successfully.
Nov 23 16:00:51 np0005532763 systemd-logind[830]: Session 54 logged out. Waiting for processes to exit.
Nov 23 16:00:51 np0005532763 systemd-logind[830]: Removed session 54.
Nov 23 16:00:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:52.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:00:52.213 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:00:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:00:52.213 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:00:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:00:52.213 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:00:52 np0005532763 python3.9[224944]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 16:00:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:53 np0005532763 python3.9[225066]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763931652.1694615-3436-107498347152036/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:00:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:53.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210053 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 16:00:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:54.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:54 np0005532763 python3.9[225217]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 16:00:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:00:54 np0005532763 python3.9[225293]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:00:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:55 np0005532763 python3.9[225444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 16:00:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:55.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:56.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:56 np0005532763 python3.9[225566]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763931654.9381847-3436-141236661161565/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:00:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:56 np0005532763 python3.9[225717]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 16:00:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:00:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:57.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:00:57 np0005532763 python3.9[225838]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763931656.370361-3436-73466585217753/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:00:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:00:58.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:58 np0005532763 python3.9[225989]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 16:00:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:59 np0005532763 python3.9[226111]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763931657.779215-3436-265868314065742/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:00:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:00:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:59 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Scheduled restart job, restart counter is at 11.
Nov 23 16:00:59 np0005532763 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 16:00:59 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.982s CPU time.
Nov 23 16:00:59 np0005532763 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 16:00:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:00:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:00:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:00:59.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:00:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:00:59 np0005532763 python3.9[226273]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 16:00:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:00:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:00:59 np0005532763 podman[226313]: 2025-11-23 21:00:59.90337002 +0000 UTC m=+0.075058972 container create 06144f1ce175232afcd561ac501aa1d44f8ef1cbc2c9d8b325de9d696d452536 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.build-date=20250325, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 23 16:00:59 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/30af3d2a1160fd3c7d355cd96ae31faba94c3f2f1db0b177016541b1efa8bf35/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Nov 23 16:00:59 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/30af3d2a1160fd3c7d355cd96ae31faba94c3f2f1db0b177016541b1efa8bf35/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 16:00:59 np0005532763 podman[226313]: 2025-11-23 21:00:59.86997286 +0000 UTC m=+0.041661852 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 16:00:59 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/30af3d2a1160fd3c7d355cd96ae31faba94c3f2f1db0b177016541b1efa8bf35/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 16:00:59 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/30af3d2a1160fd3c7d355cd96ae31faba94c3f2f1db0b177016541b1efa8bf35/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.dqbktw-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 16:00:59 np0005532763 podman[226313]: 2025-11-23 21:00:59.990528159 +0000 UTC m=+0.162217121 container init 06144f1ce175232afcd561ac501aa1d44f8ef1cbc2c9d8b325de9d696d452536 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Nov 23 16:00:59 np0005532763 podman[226313]: 2025-11-23 21:00:59.997225545 +0000 UTC m=+0.168914497 container start 06144f1ce175232afcd561ac501aa1d44f8ef1cbc2c9d8b325de9d696d452536 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 16:01:00 np0005532763 bash[226313]: 06144f1ce175232afcd561ac501aa1d44f8ef1cbc2c9d8b325de9d696d452536
Nov 23 16:01:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:00 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Nov 23 16:01:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:00 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Nov 23 16:01:00 np0005532763 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 16:01:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:00 np0005532763 podman[226364]: 2025-11-23 21:01:00.043419432 +0000 UTC m=+0.094593766 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 23 16:01:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:00.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:00 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Nov 23 16:01:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:00 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Nov 23 16:01:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:00 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Nov 23 16:01:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:00 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Nov 23 16:01:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:00 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Nov 23 16:01:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:00 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:01:00 np0005532763 python3.9[226506]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763931659.2119508-3436-43425588792122/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:01:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:01 np0005532763 python3.9[226659]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:01:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:01:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:01.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:01:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:01:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:02.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:01:02 np0005532763 python3.9[226852]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:01:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:03 np0005532763 python3.9[227005]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 16:01:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:01:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:03.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:01:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:04.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:04 np0005532763 python3.9[227158]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 16:01:04 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 16:01:04 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 3947 writes, 21K keys, 3947 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.05 MB/s#012Cumulative WAL: 3947 writes, 3947 syncs, 1.00 writes per sync, written: 0.05 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1458 writes, 6857 keys, 1458 commit groups, 1.0 writes per commit group, ingest: 16.40 MB, 0.03 MB/s#012Interval WAL: 1458 writes, 1458 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    153.6      0.22              0.13        10    0.022       0      0       0.0       0.0#012  L6      1/0   12.76 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5    214.1    182.0      0.64              0.39         9    0.071     43K   4826       0.0       0.0#012 Sum      1/0   12.76 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5    160.0    174.8      0.86              0.52        19    0.045     43K   4826       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.5    187.1    186.9      0.35              0.21         8    0.043     22K   2563       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    214.1    182.0      0.64              0.39         9    0.071     43K   4826       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    155.3      0.21              0.13         9    0.024       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.033, interval 0.012#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.15 GB write, 0.12 MB/s write, 0.13 GB read, 0.11 MB/s read, 0.9 seconds#012Interval compaction: 0.06 GB write, 0.11 MB/s write, 0.06 GB read, 0.11 MB/s read, 0.3 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e7d0d09350#2 capacity: 304.00 MB usage: 8.33 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000128 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(468,7.96 MB,2.61751%) FilterBlock(19,130.80 KB,0.0420169%) IndexBlock(19,251.02 KB,0.0806357%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Nov 23 16:01:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:01:04 np0005532763 python3.9[227281]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1763931663.472947-3757-73484953818486/.source _original_basename=._7df2re8 follow=False checksum=1a34a795567ae8875c971381dc92b5291049196d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 23 16:01:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:05.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:05 np0005532763 python3.9[227435]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 16:01:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:01:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:06.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:01:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:06 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:01:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:06 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:01:06 np0005532763 python3.9[227587]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 16:01:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:07 np0005532763 podman[227660]: 2025-11-23 21:01:07.279808984 +0000 UTC m=+0.149609159 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 23 16:01:07 np0005532763 python3.9[227733]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763931666.1746855-3834-129848138791081/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=4c77b2c041a7564aa2c84115117dc8517e9bb9ef backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:01:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000056s ======
Nov 23 16:01:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:07.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Nov 23 16:01:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:08.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:08 np0005532763 python3.9[227886]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 23 16:01:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:08 np0005532763 python3.9[228008]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763931667.681958-3881-16673324529492/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=941d5739094d046b86479403aeaaf0441b82ba11 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 23 16:01:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:09.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:01:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:10.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:10 np0005532763 python3.9[228161]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 23 16:01:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:10 np0005532763 podman[228286]: 2025-11-23 21:01:10.998591187 +0000 UTC m=+0.087037686 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 23 16:01:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:11 np0005532763 python3.9[228334]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 16:01:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:01:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:11.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:01:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:12.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:12 np0005532763 python3[228488]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:12 : epoch 6923760c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 16:01:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:13.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:13 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2e8000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:13 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2dc0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:13 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:14.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:01:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:01:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:15.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:01:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210115 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 16:01:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:15 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:15 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c8000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:15 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2dc0021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:16.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:01:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:17.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:01:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:17 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:17 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:17 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:18.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:19.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:19 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2dc0021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:01:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:19 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:19 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:20.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:21 np0005532763 podman[228501]: 2025-11-23 21:01:21.536967381 +0000 UTC m=+9.141012795 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076
Nov 23 16:01:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:01:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:21.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:01:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:21 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:21 np0005532763 podman[228604]: 2025-11-23 21:01:21.749508873 +0000 UTC m=+0.075155985 container create 15266185c9ff8245ec5d84440b8c1ecae68bec690321512e4acfabde6c956cdd (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute_init, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 16:01:21 np0005532763 podman[228604]: 2025-11-23 21:01:21.707496722 +0000 UTC m=+0.033143894 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076
Nov 23 16:01:21 np0005532763 python3[228488]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076 bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 23 16:01:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:21 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2dc0021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:21 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:22.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:01:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:23.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:01:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:23 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:23 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:23 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2dc0021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:24.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:01:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:01:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:25.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:01:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:25 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:25 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:25 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c8002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:01:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:26.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:01:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:27.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:27 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2dc0021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:27 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:27 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:01:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:28.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:01:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:29.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:29 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c8002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:01:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:29 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2dc0021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:29 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:01:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:30.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:01:30 np0005532763 podman[228698]: 2025-11-23 21:01:30.234260284 +0000 UTC m=+0.099762050 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 23 16:01:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:01:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:31.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:01:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:31 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:31 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c8002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:31 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2dc0021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:32.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:33.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:33 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:33 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:33 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:34.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:01:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:35.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:35 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2dc0021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:35 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:35 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:36.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:01:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:37.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:01:37 np0005532763 podman[228823]: 2025-11-23 21:01:37.631341134 +0000 UTC m=+0.164259667 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 16:01:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:37 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:37 np0005532763 python3.9[228872]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 16:01:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210137 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 16:01:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:37 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2dc0021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:38 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:38.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:39 np0005532763 python3.9[229034]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 23 16:01:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:39.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 16:01:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.1 total, 600.0 interval#012Cumulative writes: 6101 writes, 25K keys, 6101 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s#012Cumulative WAL: 6101 writes, 1158 syncs, 5.27 writes per sync, written: 0.02 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 516 writes, 817 keys, 516 commit groups, 1.0 writes per commit group, ingest: 0.27 MB, 0.00 MB/s#012Interval WAL: 516 writes, 256 syncs, 2.02 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557d78bc9350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557d78bc9350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 
collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 
0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Nov 23 16:01:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:39 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:01:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:39 np0005532763 python3.9[229187]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 23 16:01:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:39 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:40 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2dc0021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:40.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:41 np0005532763 python3[229348]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 23 16:01:41 np0005532763 podman[229402]: 2025-11-23 21:01:41.1793734 +0000 UTC m=+0.068252902 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 16:01:41 np0005532763 podman[229464]: 2025-11-23 21:01:41.311295126 +0000 UTC m=+0.056811994 container create f3c0cc402a22f5be8671accde6980d0d76ac2bdc1ca3540b965dffc485d1be66 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, tcib_managed=true, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Nov 23 16:01:41 np0005532763 podman[229464]: 2025-11-23 21:01:41.281493165 +0000 UTC m=+0.027010013 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076
Nov 23 16:01:41 np0005532763 python3[229348]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume 
/etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076 kolla_start
Nov 23 16:01:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:41.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:41 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:41 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:42 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:42.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:42 np0005532763 python3.9[229763]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 16:01:42 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:01:42 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:01:42 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:01:42 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:01:42 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:01:42 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:01:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:43 np0005532763 python3.9[229918]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:01:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:01:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:43.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:01:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:43 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2dc0021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:44 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:44 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:44 np0005532763 python3.9[230070]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763931703.3741853-4156-46890923372650/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 23 16:01:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:01:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:44.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:01:44 np0005532763 python3.9[230146]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 23 16:01:44 np0005532763 systemd[1]: Reloading.
Nov 23 16:01:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:01:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:44 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 16:01:44 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 16:01:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:45.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:45 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:45 np0005532763 python3.9[230259]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 23 16:01:45 np0005532763 systemd[1]: Reloading.
Nov 23 16:01:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:46 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2dc0021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:46 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2b4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:46 np0005532763 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 23 16:01:46 np0005532763 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 23 16:01:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:46.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:46 np0005532763 systemd[1]: Starting nova_compute container...
Nov 23 16:01:46 np0005532763 systemd[1]: Started libcrun container.
Nov 23 16:01:46 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff1899460981cfa6c12cb6f79592d054e2fa693cee5abef6995622ed1beea179/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 23 16:01:46 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff1899460981cfa6c12cb6f79592d054e2fa693cee5abef6995622ed1beea179/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 23 16:01:46 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff1899460981cfa6c12cb6f79592d054e2fa693cee5abef6995622ed1beea179/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 23 16:01:46 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff1899460981cfa6c12cb6f79592d054e2fa693cee5abef6995622ed1beea179/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 16:01:46 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff1899460981cfa6c12cb6f79592d054e2fa693cee5abef6995622ed1beea179/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 16:01:46 np0005532763 podman[230301]: 2025-11-23 21:01:46.44792214 +0000 UTC m=+0.143226831 container init f3c0cc402a22f5be8671accde6980d0d76ac2bdc1ca3540b965dffc485d1be66 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:01:46 np0005532763 podman[230301]: 2025-11-23 21:01:46.468454582 +0000 UTC m=+0.163759213 container start f3c0cc402a22f5be8671accde6980d0d76ac2bdc1ca3540b965dffc485d1be66 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:01:46 np0005532763 podman[230301]: nova_compute
Nov 23 16:01:46 np0005532763 nova_compute[230316]: + sudo -E kolla_set_configs
Nov 23 16:01:46 np0005532763 systemd[1]: Started nova_compute container.
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Validating config file
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Copying service configuration files
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Deleting /etc/ceph
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Creating directory /etc/ceph
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Setting permission for /etc/ceph
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Writing out command to execute
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 23 16:01:46 np0005532763 nova_compute[230316]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 23 16:01:46 np0005532763 nova_compute[230316]: ++ cat /run_command
Nov 23 16:01:46 np0005532763 nova_compute[230316]: + CMD=nova-compute
Nov 23 16:01:46 np0005532763 nova_compute[230316]: + ARGS=
Nov 23 16:01:46 np0005532763 nova_compute[230316]: + sudo kolla_copy_cacerts
Nov 23 16:01:46 np0005532763 nova_compute[230316]: + [[ ! -n '' ]]
Nov 23 16:01:46 np0005532763 nova_compute[230316]: + . kolla_extend_start
Nov 23 16:01:46 np0005532763 nova_compute[230316]: + echo 'Running command: '\''nova-compute'\'''
Nov 23 16:01:46 np0005532763 nova_compute[230316]: Running command: 'nova-compute'
Nov 23 16:01:46 np0005532763 nova_compute[230316]: + umask 0022
Nov 23 16:01:46 np0005532763 nova_compute[230316]: + exec nova-compute
Nov 23 16:01:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:47.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:47 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:47 np0005532763 python3.9[230479]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 16:01:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:48 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:48 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2dc0021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:01:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:48.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:01:48 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:01:48 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:01:48 np0005532763 nova_compute[230316]: 2025-11-23 21:01:48.811 230320 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 23 16:01:48 np0005532763 nova_compute[230316]: 2025-11-23 21:01:48.812 230320 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 23 16:01:48 np0005532763 nova_compute[230316]: 2025-11-23 21:01:48.812 230320 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 23 16:01:48 np0005532763 nova_compute[230316]: 2025-11-23 21:01:48.812 230320 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 23 16:01:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:48 np0005532763 python3.9[230656]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 16:01:48 np0005532763 nova_compute[230316]: 2025-11-23 21:01:48.945 230320 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:01:48 np0005532763 nova_compute[230316]: 2025-11-23 21:01:48.974 230320 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:01:48 np0005532763 nova_compute[230316]: 2025-11-23 21:01:48.974 230320 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 23 16:01:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.393 230320 INFO nova.virt.driver [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.578 230320 INFO nova.compute.provider_config [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 23 16:01:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:49.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.640 230320 DEBUG oslo_concurrency.lockutils [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.640 230320 DEBUG oslo_concurrency.lockutils [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.641 230320 DEBUG oslo_concurrency.lockutils [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.641 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.641 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.641 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.642 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.642 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.642 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.642 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.642 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.642 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.642 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.643 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.643 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.643 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.643 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.643 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.643 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.644 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.644 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.644 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.644 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.644 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.644 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.644 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.644 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.645 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.645 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.645 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.645 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.645 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.646 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.646 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.646 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.646 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.646 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.646 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.646 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.647 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.647 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.647 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.647 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.647 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.647 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.647 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.648 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.648 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.648 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.648 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.648 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.648 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.648 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.649 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.649 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.649 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.649 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.649 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.649 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.649 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.650 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.650 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.650 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.650 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.650 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.650 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.651 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.651 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.651 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.651 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.651 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.651 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.652 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.652 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.652 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.652 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.652 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.652 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.652 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.653 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.653 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.653 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.653 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.653 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.653 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.653 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.654 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.654 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.654 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.654 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.654 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.654 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.654 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.654 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.655 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.655 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.655 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.655 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.655 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.655 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.655 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.656 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.656 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.656 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.656 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.656 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.656 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.657 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.657 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.657 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.657 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.657 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.657 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.658 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.658 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.658 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.658 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.658 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.658 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.658 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.659 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.659 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.659 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.659 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.659 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.659 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.659 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.660 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.660 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.660 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.660 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.660 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.660 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.660 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.661 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.661 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.661 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.661 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.661 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.661 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.661 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.662 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.662 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.662 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.662 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.662 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.662 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.662 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.663 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.663 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.663 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.663 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.663 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.663 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.664 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.664 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.664 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.664 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.664 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.664 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.664 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.665 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.665 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.665 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.665 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.665 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.665 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.665 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.666 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.666 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.666 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.666 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.666 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.666 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.666 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.667 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.667 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.667 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.667 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.667 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.667 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.668 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.668 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.668 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.668 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.668 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.668 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.668 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.669 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.669 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.669 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.669 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.669 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.669 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.669 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.670 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.670 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.670 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.670 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.670 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.670 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.670 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.671 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.671 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.671 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.671 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.671 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.671 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.671 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.672 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.672 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.672 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.672 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.672 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.672 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.672 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.673 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.673 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.673 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.673 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.673 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.673 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.673 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.674 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.674 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.674 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.674 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.674 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.674 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.674 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.674 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.675 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.675 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.675 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.675 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.675 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.675 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.676 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.676 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.676 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.676 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.676 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.676 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.676 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.677 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.677 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.677 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.677 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.677 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.677 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.677 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.677 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.678 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.678 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.678 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.678 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.678 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.678 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.678 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.679 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.679 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.679 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.679 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.679 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.679 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.679 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.680 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.680 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.680 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.680 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.680 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.680 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.681 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.681 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.681 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.681 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.681 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.681 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.681 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.682 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.682 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.682 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.682 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.682 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.682 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.682 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.682 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.683 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.683 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.683 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.683 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.683 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.683 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.683 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.684 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.684 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.684 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.684 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.684 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.684 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.684 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.684 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.685 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.685 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.685 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.685 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.685 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.685 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.685 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.686 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.686 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.686 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.686 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.686 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.686 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.686 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.687 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.687 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.687 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.687 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.687 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.687 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.687 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.687 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.688 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.688 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.688 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.688 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.688 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.688 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.688 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.689 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.689 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.689 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.689 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.689 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.689 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.689 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.690 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.690 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.690 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.690 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.690 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.690 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.690 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.691 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.691 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.691 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.691 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.691 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.691 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.691 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.691 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.692 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.692 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.692 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.692 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.692 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.692 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.693 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.693 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.693 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.693 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.693 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.693 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.693 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.694 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.694 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.694 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.694 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.694 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.694 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.694 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.694 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.695 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.695 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.695 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.695 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.695 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.695 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.695 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.696 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.696 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.696 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.696 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.696 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.696 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.696 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.696 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.697 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.697 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.697 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.697 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.697 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.697 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.697 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.698 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.698 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.698 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.698 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.698 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.698 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.698 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.699 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.699 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.699 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.699 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.699 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.699 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.699 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.699 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.700 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.700 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.700 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.700 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.700 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.700 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.700 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.701 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.701 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.701 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.701 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.701 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.701 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.701 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.701 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.702 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.702 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.702 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.702 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.702 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.702 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.702 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.703 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.703 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.703 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.703 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.703 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.703 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.703 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.704 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.704 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.704 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.704 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.704 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.704 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.704 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.704 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.705 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.705 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.705 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.705 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.705 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.705 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.705 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.706 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.706 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.706 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.706 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.706 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.706 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.706 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.707 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.707 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.707 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.707 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.707 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.707 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.707 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.708 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.708 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.708 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.708 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.708 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.708 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.708 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.709 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.709 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.709 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.709 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.709 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.709 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.709 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.709 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.710 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.710 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.710 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.710 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.710 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.710 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.710 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.711 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.711 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.711 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.711 230320 WARNING oslo_config.cfg [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 23 16:01:49 np0005532763 nova_compute[230316]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 23 16:01:49 np0005532763 nova_compute[230316]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 23 16:01:49 np0005532763 nova_compute[230316]: and ``live_migration_inbound_addr`` respectively.
Nov 23 16:01:49 np0005532763 nova_compute[230316]: ).  Its value may be silently ignored in the future.#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.711 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.712 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.712 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.712 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.712 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.712 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.712 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.713 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.713 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.713 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.713 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.713 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.713 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.713 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.713 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.714 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.714 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.714 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.714 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.rbd_secret_uuid        = 03808be8-ae4a-5548-82e6-4a294f1bc627 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.714 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.714 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.714 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.715 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.715 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.715 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.715 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.715 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.715 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.715 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.716 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.716 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.716 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.716 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.716 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.716 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.716 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.717 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.717 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.717 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.717 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.717 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.717 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.717 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.717 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.718 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.718 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.718 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.718 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.718 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.718 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.718 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.719 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.719 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.719 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.719 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.719 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.719 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.719 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.720 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.720 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.720 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.720 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.720 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.720 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.720 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.720 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.721 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.721 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.721 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.721 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.721 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.721 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.721 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.722 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.722 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.722 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.722 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.722 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.722 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.722 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.722 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.723 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.723 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.723 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.723 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.723 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.723 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.723 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.724 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.724 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.724 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.724 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.724 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.724 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.724 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.725 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.725 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.725 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.725 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.725 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.725 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.725 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.725 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.726 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.726 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.726 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.726 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.726 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.726 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.726 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.727 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.727 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.727 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.727 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.727 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.727 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.727 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.728 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.728 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.728 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.728 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.728 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.728 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.728 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.729 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.729 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.729 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.729 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.729 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.729 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.729 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.729 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.730 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.730 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.730 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.730 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.730 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.730 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.730 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.731 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.731 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.731 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.731 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.731 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.731 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.732 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.732 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.732 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.732 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.732 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.732 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.732 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.732 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.733 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.733 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.733 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.733 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.733 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.733 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.733 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.734 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.734 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.734 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.734 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.734 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.734 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.735 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.735 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.735 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.735 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.735 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.735 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.735 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.736 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.736 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.736 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.736 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.736 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.736 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.736 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.737 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.737 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.737 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.737 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.737 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.737 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.737 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.738 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.738 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.738 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.738 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.738 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.738 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.738 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.739 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.739 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.739 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.739 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.739 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.739 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.740 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.740 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.740 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.740 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.740 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.740 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.740 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.741 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.741 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.741 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.741 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.741 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.741 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.741 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.741 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.742 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.742 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.742 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.742 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.742 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.742 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.742 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.743 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.743 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.743 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.743 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.743 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.743 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.743 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.743 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.744 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.744 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.744 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.744 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.744 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.744 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.744 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.745 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.745 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.745 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.745 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.745 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.745 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.745 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.745 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.746 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.746 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.746 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.746 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.746 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.746 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.747 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.747 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.747 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.747 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.747 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.747 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.748 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.748 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.748 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.748 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.748 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.748 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.748 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.748 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.749 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.749 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.749 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.749 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.749 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.749 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.749 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.750 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.750 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.750 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.750 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.750 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.750 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.750 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.751 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.751 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.751 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.751 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.751 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.751 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.751 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.752 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.752 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.752 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.752 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.752 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.752 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.752 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.753 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.753 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.753 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.753 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.753 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.753 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.753 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.754 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:49 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2b40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.754 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.754 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.754 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.754 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.754 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.754 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.754 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.755 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.755 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.755 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.755 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.755 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.755 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.755 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.756 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.756 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.756 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.756 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.756 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.756 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.756 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.756 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.757 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.757 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.757 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.757 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.757 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.757 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.757 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.758 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.758 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.758 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.758 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.758 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.758 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.758 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.759 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.759 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.759 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.759 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.759 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.759 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.760 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.760 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.760 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.760 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.760 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.760 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.760 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.760 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.761 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.761 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.761 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.761 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.761 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.761 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.761 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.762 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.762 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.762 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.762 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.762 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.762 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.762 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.763 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.763 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.763 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.763 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.763 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.763 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.763 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.764 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.764 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.764 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.764 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.764 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.764 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.764 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.764 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.765 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.765 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.765 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.765 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.765 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.765 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.765 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.766 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.766 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.766 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.766 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.766 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.766 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.767 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.767 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.767 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.767 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.767 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.768 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.768 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.768 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.768 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.768 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.768 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.769 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.769 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.769 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.769 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.769 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.769 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.769 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.769 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.770 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.770 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.770 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.770 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.770 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.770 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.770 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.771 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.771 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.771 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.771 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.771 230320 DEBUG oslo_service.service [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.772 230320 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 23 16:01:49 np0005532763 python3.9[230810]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 23 16:01:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.840 230320 DEBUG nova.virt.libvirt.host [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.841 230320 DEBUG nova.virt.libvirt.host [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.841 230320 DEBUG nova.virt.libvirt.host [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.841 230320 DEBUG nova.virt.libvirt.host [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 23 16:01:49 np0005532763 systemd[1]: Starting libvirt QEMU daemon...
Nov 23 16:01:49 np0005532763 systemd[1]: Started libvirt QEMU daemon.
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.931 230320 DEBUG nova.virt.libvirt.host [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f81be7bebb0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.934 230320 DEBUG nova.virt.libvirt.host [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f81be7bebb0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.936 230320 INFO nova.virt.libvirt.driver [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.948 230320 WARNING nova.virt.libvirt.driver [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Nov 23 16:01:49 np0005532763 nova_compute[230316]: 2025-11-23 21:01:49.948 230320 DEBUG nova.virt.libvirt.volume.mount [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Nov 23 16:01:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:50 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:50 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:50.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:50 : epoch 6923760c : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:01:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:50 np0005532763 python3.9[231024]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 23 16:01:50 np0005532763 nova_compute[230316]: 2025-11-23 21:01:50.956 230320 INFO nova.virt.libvirt.host [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Libvirt host capabilities <capabilities>
Nov 23 16:01:50 np0005532763 nova_compute[230316]: 
Nov 23 16:01:50 np0005532763 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 16:01:50 np0005532763 nova_compute[230316]:  <host>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    <uuid>e38e4d8b-cfb8-4d24-8752-3d68cd15bb48</uuid>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    <cpu>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <arch>x86_64</arch>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <model>EPYC-Rome-v4</model>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <vendor>AMD</vendor>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <microcode version='16777317'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <signature family='23' model='49' stepping='0'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='x2apic'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='tsc-deadline'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='osxsave'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='hypervisor'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='tsc_adjust'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='spec-ctrl'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='stibp'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='arch-capabilities'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='ssbd'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='cmp_legacy'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='topoext'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='virt-ssbd'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='lbrv'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='tsc-scale'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='vmcb-clean'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='pause-filter'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='pfthreshold'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='svme-addr-chk'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='rdctl-no'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='skip-l1dfl-vmentry'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='mds-no'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <feature name='pschange-mc-no'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <pages unit='KiB' size='4'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <pages unit='KiB' size='2048'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <pages unit='KiB' size='1048576'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    </cpu>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    <power_management>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <suspend_mem/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    </power_management>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    <iommu support='no'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    <migration_features>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <live/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <uri_transports>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:        <uri_transport>tcp</uri_transport>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:        <uri_transport>rdma</uri_transport>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      </uri_transports>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    </migration_features>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    <topology>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <cells num='1'>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:        <cell id='0'>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:          <memory unit='KiB'>7864320</memory>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:          <pages unit='KiB' size='4'>1966080</pages>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:          <pages unit='KiB' size='2048'>0</pages>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:          <distances>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:            <sibling id='0' value='10'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:          </distances>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:          <cpus num='8'>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:          </cpus>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:        </cell>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      </cells>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    </topology>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    <cache>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    </cache>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    <secmodel>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <model>selinux</model>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <doi>0</doi>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    </secmodel>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    <secmodel>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <model>dac</model>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <doi>0</doi>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    </secmodel>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:  </host>
Nov 23 16:01:50 np0005532763 nova_compute[230316]: 
Nov 23 16:01:50 np0005532763 nova_compute[230316]:  <guest>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    <os_type>hvm</os_type>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    <arch name='i686'>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <wordsize>32</wordsize>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <domain type='qemu'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <domain type='kvm'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    </arch>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    <features>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <pae/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <nonpae/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <acpi default='on' toggle='yes'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <apic default='on' toggle='no'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <cpuselection/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <deviceboot/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <disksnapshot default='on' toggle='no'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <externalSnapshot/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    </features>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:  </guest>
Nov 23 16:01:50 np0005532763 nova_compute[230316]: 
Nov 23 16:01:50 np0005532763 nova_compute[230316]:  <guest>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    <os_type>hvm</os_type>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    <arch name='x86_64'>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <wordsize>64</wordsize>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <domain type='qemu'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <domain type='kvm'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    </arch>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    <features>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <acpi default='on' toggle='yes'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <apic default='on' toggle='no'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <cpuselection/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <deviceboot/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <disksnapshot default='on' toggle='no'/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:      <externalSnapshot/>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:    </features>
Nov 23 16:01:50 np0005532763 nova_compute[230316]:  </guest>
Nov 23 16:01:50 np0005532763 nova_compute[230316]: 
Nov 23 16:01:50 np0005532763 nova_compute[230316]: </capabilities>
Nov 23 16:01:50 np0005532763 nova_compute[230316]: #033[00m
Nov 23 16:01:50 np0005532763 nova_compute[230316]: 2025-11-23 21:01:50.964 230320 DEBUG nova.virt.libvirt.host [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:50.998 230320 DEBUG nova.virt.libvirt.host [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 23 16:01:51 np0005532763 nova_compute[230316]: <domainCapabilities>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <path>/usr/libexec/qemu-kvm</path>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <domain>kvm</domain>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <arch>i686</arch>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <vcpu max='4096'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <iothreads supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <os supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <enum name='firmware'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <loader supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>rom</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pflash</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='readonly'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>yes</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>no</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='secure'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>no</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </loader>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </os>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <cpu>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <mode name='host-passthrough' supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='hostPassthroughMigratable'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>on</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>off</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </mode>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <mode name='maximum' supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='maximumMigratable'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>on</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>off</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </mode>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <mode name='host-model' supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <vendor>AMD</vendor>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='x2apic'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='tsc-deadline'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='hypervisor'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='tsc_adjust'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='spec-ctrl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='stibp'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='ssbd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='cmp_legacy'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='overflow-recov'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='succor'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='ibrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='amd-ssbd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='virt-ssbd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='lbrv'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='tsc-scale'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='vmcb-clean'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='flushbyasid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='pause-filter'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='pfthreshold'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='svme-addr-chk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='disable' name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </mode>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <mode name='custom' supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-noTSX'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v5'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cooperlake'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cooperlake-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cooperlake-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Denverton'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mpx'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Denverton-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mpx'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Denverton-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Denverton-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Dhyana-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Genoa'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amd-psfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='auto-ibrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='stibp-always-on'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Genoa-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amd-psfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='auto-ibrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='stibp-always-on'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Milan'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Milan-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Milan-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amd-psfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='stibp-always-on'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Rome'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Rome-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Rome-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Rome-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='GraniteRapids'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='prefetchiti'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='GraniteRapids-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='prefetchiti'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='GraniteRapids-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx10'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx10-128'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx10-256'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx10-512'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='prefetchiti'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-noTSX'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-noTSX'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v5'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v6'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v7'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='IvyBridge'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='IvyBridge-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='IvyBridge-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='IvyBridge-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='KnightsMill'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-4fmaps'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-4vnniw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512er'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512pf'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='KnightsMill-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-4fmaps'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-4vnniw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512er'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512pf'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Opteron_G4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fma4'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xop'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Opteron_G4-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fma4'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xop'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Opteron_G5'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fma4'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tbm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xop'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Opteron_G5-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fma4'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tbm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xop'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SapphireRapids'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SapphireRapids-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SapphireRapids-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SapphireRapids-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SierraForest'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-ne-convert'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cmpccxadd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SierraForest-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-ne-convert'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cmpccxadd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v5'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='core-capability'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mpx'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='split-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='core-capability'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mpx'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='split-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='core-capability'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='split-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='core-capability'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='split-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='athlon'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnow'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnowext'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='athlon-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnow'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnowext'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='core2duo'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='core2duo-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='coreduo'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='coreduo-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='n270'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='n270-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='phenom'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnow'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnowext'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='phenom-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnow'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnowext'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </mode>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </cpu>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <memoryBacking supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <enum name='sourceType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>file</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>anonymous</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>memfd</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </memoryBacking>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <devices>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <disk supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='diskDevice'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>disk</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>cdrom</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>floppy</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>lun</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='bus'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>fdc</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>scsi</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>usb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>sata</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio-transitional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio-non-transitional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </disk>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <graphics supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vnc</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>egl-headless</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>dbus</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </graphics>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <video supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='modelType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vga</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>cirrus</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>none</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>bochs</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>ramfb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </video>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <hostdev supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='mode'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>subsystem</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='startupPolicy'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>default</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>mandatory</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>requisite</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>optional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='subsysType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>usb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pci</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>scsi</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='capsType'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='pciBackend'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </hostdev>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <rng supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio-transitional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio-non-transitional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendModel'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>random</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>egd</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>builtin</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </rng>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <filesystem supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='driverType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>path</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>handle</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtiofs</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </filesystem>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <tpm supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tpm-tis</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tpm-crb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendModel'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>emulator</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>external</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendVersion'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>2.0</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </tpm>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <redirdev supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='bus'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>usb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </redirdev>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <channel supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pty</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>unix</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </channel>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <crypto supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>qemu</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendModel'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>builtin</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </crypto>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <interface supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>default</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>passt</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </interface>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <panic supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>isa</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>hyperv</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </panic>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <console supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>null</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vc</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pty</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>dev</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>file</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pipe</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>stdio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>udp</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tcp</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>unix</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>qemu-vdagent</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>dbus</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </console>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </devices>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <features>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <gic supported='no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <vmcoreinfo supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <genid supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <backingStoreInput supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <backup supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <async-teardown supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <ps2 supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <sev supported='no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <sgx supported='no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <hyperv supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='features'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>relaxed</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vapic</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>spinlocks</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vpindex</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>runtime</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>synic</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>stimer</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>reset</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vendor_id</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>frequencies</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>reenlightenment</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tlbflush</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>ipi</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>avic</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>emsr_bitmap</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>xmm_input</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <defaults>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <spinlocks>4095</spinlocks>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <stimer_direct>on</stimer_direct>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <tlbflush_direct>on</tlbflush_direct>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <tlbflush_extended>on</tlbflush_extended>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </defaults>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </hyperv>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <launchSecurity supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='sectype'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tdx</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </launchSecurity>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </features>
Nov 23 16:01:51 np0005532763 nova_compute[230316]: </domainCapabilities>
Nov 23 16:01:51 np0005532763 nova_compute[230316]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.004 230320 DEBUG nova.virt.libvirt.host [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 23 16:01:51 np0005532763 nova_compute[230316]: <domainCapabilities>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <path>/usr/libexec/qemu-kvm</path>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <domain>kvm</domain>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <arch>i686</arch>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <vcpu max='240'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <iothreads supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <os supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <enum name='firmware'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <loader supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>rom</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pflash</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='readonly'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>yes</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>no</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='secure'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>no</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </loader>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </os>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <cpu>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <mode name='host-passthrough' supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='hostPassthroughMigratable'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>on</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>off</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </mode>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <mode name='maximum' supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='maximumMigratable'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>on</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>off</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </mode>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <mode name='host-model' supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <vendor>AMD</vendor>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='x2apic'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='tsc-deadline'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='hypervisor'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='tsc_adjust'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='spec-ctrl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='stibp'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='ssbd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='cmp_legacy'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='overflow-recov'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='succor'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='ibrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='amd-ssbd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='virt-ssbd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='lbrv'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='tsc-scale'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='vmcb-clean'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='flushbyasid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='pause-filter'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='pfthreshold'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='svme-addr-chk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='disable' name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </mode>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <mode name='custom' supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-noTSX'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v5'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cooperlake'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cooperlake-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cooperlake-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Denverton'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mpx'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Denverton-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mpx'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Denverton-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Denverton-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Dhyana-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Genoa'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amd-psfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='auto-ibrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='stibp-always-on'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Genoa-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amd-psfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='auto-ibrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='stibp-always-on'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Milan'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Milan-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Milan-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amd-psfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='stibp-always-on'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Rome'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Rome-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Rome-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Rome-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='GraniteRapids'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='prefetchiti'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='GraniteRapids-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='prefetchiti'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='GraniteRapids-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx10'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx10-128'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx10-256'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx10-512'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='prefetchiti'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-noTSX'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-noTSX'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v5'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v6'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v7'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='IvyBridge'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='IvyBridge-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='IvyBridge-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='IvyBridge-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='KnightsMill'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-4fmaps'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-4vnniw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512er'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512pf'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='KnightsMill-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-4fmaps'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-4vnniw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512er'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512pf'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Opteron_G4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fma4'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xop'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Opteron_G4-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fma4'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xop'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Opteron_G5'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fma4'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tbm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xop'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Opteron_G5-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fma4'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tbm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xop'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SapphireRapids'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SapphireRapids-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SapphireRapids-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SapphireRapids-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SierraForest'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-ne-convert'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cmpccxadd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SierraForest-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-ne-convert'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cmpccxadd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v5'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='core-capability'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mpx'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='split-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='core-capability'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mpx'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='split-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='core-capability'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='split-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='core-capability'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='split-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='athlon'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnow'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnowext'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='athlon-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnow'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnowext'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='core2duo'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='core2duo-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='coreduo'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='coreduo-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='n270'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='n270-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='phenom'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnow'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnowext'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='phenom-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnow'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnowext'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </mode>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </cpu>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <memoryBacking supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <enum name='sourceType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>file</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>anonymous</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>memfd</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </memoryBacking>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <devices>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <disk supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='diskDevice'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>disk</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>cdrom</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>floppy</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>lun</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='bus'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>ide</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>fdc</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>scsi</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>usb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>sata</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio-transitional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio-non-transitional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </disk>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <graphics supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vnc</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>egl-headless</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>dbus</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </graphics>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <video supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='modelType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vga</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>cirrus</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>none</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>bochs</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>ramfb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </video>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <hostdev supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='mode'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>subsystem</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='startupPolicy'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>default</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>mandatory</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>requisite</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>optional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='subsysType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>usb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pci</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>scsi</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='capsType'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='pciBackend'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </hostdev>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <rng supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio-transitional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio-non-transitional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendModel'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>random</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>egd</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>builtin</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </rng>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <filesystem supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='driverType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>path</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>handle</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtiofs</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </filesystem>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <tpm supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tpm-tis</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tpm-crb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendModel'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>emulator</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>external</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendVersion'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>2.0</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </tpm>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <redirdev supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='bus'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>usb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </redirdev>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <channel supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pty</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>unix</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </channel>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <crypto supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>qemu</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendModel'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>builtin</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </crypto>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <interface supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>default</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>passt</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </interface>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <panic supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>isa</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>hyperv</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </panic>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <console supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>null</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vc</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pty</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>dev</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>file</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pipe</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>stdio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>udp</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tcp</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>unix</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>qemu-vdagent</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>dbus</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </console>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </devices>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <features>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <gic supported='no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <vmcoreinfo supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <genid supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <backingStoreInput supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <backup supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <async-teardown supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <ps2 supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <sev supported='no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <sgx supported='no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <hyperv supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='features'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>relaxed</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vapic</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>spinlocks</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vpindex</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>runtime</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>synic</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>stimer</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>reset</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vendor_id</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>frequencies</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>reenlightenment</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tlbflush</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>ipi</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>avic</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>emsr_bitmap</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>xmm_input</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <defaults>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <spinlocks>4095</spinlocks>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <stimer_direct>on</stimer_direct>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <tlbflush_direct>on</tlbflush_direct>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <tlbflush_extended>on</tlbflush_extended>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </defaults>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </hyperv>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <launchSecurity supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='sectype'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tdx</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </launchSecurity>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </features>
Nov 23 16:01:51 np0005532763 nova_compute[230316]: </domainCapabilities>
Nov 23 16:01:51 np0005532763 nova_compute[230316]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.038 230320 DEBUG nova.virt.libvirt.host [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.042 230320 DEBUG nova.virt.libvirt.host [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 23 16:01:51 np0005532763 nova_compute[230316]: <domainCapabilities>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <path>/usr/libexec/qemu-kvm</path>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <domain>kvm</domain>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <arch>x86_64</arch>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <vcpu max='4096'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <iothreads supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <os supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <enum name='firmware'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>efi</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <loader supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>rom</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pflash</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='readonly'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>yes</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>no</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='secure'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>yes</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>no</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </loader>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </os>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <cpu>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <mode name='host-passthrough' supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='hostPassthroughMigratable'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>on</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>off</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </mode>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <mode name='maximum' supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='maximumMigratable'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>on</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>off</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </mode>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <mode name='host-model' supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <vendor>AMD</vendor>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='x2apic'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='tsc-deadline'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='hypervisor'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='tsc_adjust'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='spec-ctrl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='stibp'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='ssbd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='cmp_legacy'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='overflow-recov'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='succor'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='ibrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='amd-ssbd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='virt-ssbd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='lbrv'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='tsc-scale'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='vmcb-clean'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='flushbyasid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='pause-filter'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='pfthreshold'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='svme-addr-chk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='disable' name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </mode>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <mode name='custom' supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-noTSX'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v5'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cooperlake'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cooperlake-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cooperlake-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Denverton'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mpx'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Denverton-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mpx'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Denverton-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Denverton-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Dhyana-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Genoa'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amd-psfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='auto-ibrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='stibp-always-on'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Genoa-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amd-psfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='auto-ibrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='stibp-always-on'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Milan'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Milan-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Milan-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amd-psfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='stibp-always-on'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Rome'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Rome-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Rome-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Rome-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='GraniteRapids'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='prefetchiti'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='GraniteRapids-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='prefetchiti'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='GraniteRapids-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx10'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx10-128'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx10-256'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx10-512'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='prefetchiti'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-noTSX'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-noTSX'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v5'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v6'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v7'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='IvyBridge'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='IvyBridge-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='IvyBridge-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='IvyBridge-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='KnightsMill'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-4fmaps'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-4vnniw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512er'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512pf'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='KnightsMill-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-4fmaps'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-4vnniw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512er'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512pf'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Opteron_G4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fma4'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xop'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Opteron_G4-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fma4'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xop'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Opteron_G5'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fma4'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tbm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xop'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Opteron_G5-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fma4'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tbm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xop'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SapphireRapids'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SapphireRapids-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SapphireRapids-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SapphireRapids-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SierraForest'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-ne-convert'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cmpccxadd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SierraForest-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-ne-convert'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cmpccxadd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v5'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='core-capability'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mpx'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='split-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='core-capability'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mpx'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='split-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='core-capability'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='split-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='core-capability'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='split-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='athlon'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnow'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnowext'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='athlon-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnow'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnowext'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='core2duo'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='core2duo-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='coreduo'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='coreduo-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='n270'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='n270-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='phenom'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnow'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnowext'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='phenom-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnow'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnowext'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </mode>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </cpu>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <memoryBacking supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <enum name='sourceType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>file</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>anonymous</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>memfd</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </memoryBacking>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <devices>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <disk supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='diskDevice'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>disk</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>cdrom</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>floppy</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>lun</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='bus'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>fdc</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>scsi</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>usb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>sata</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio-transitional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio-non-transitional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </disk>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <graphics supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vnc</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>egl-headless</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>dbus</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </graphics>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <video supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='modelType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vga</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>cirrus</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>none</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>bochs</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>ramfb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </video>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <hostdev supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='mode'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>subsystem</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='startupPolicy'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>default</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>mandatory</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>requisite</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>optional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='subsysType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>usb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pci</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>scsi</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='capsType'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='pciBackend'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </hostdev>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <rng supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio-transitional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio-non-transitional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendModel'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>random</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>egd</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>builtin</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </rng>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <filesystem supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='driverType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>path</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>handle</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtiofs</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </filesystem>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <tpm supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tpm-tis</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tpm-crb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendModel'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>emulator</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>external</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendVersion'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>2.0</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </tpm>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <redirdev supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='bus'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>usb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </redirdev>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <channel supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pty</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>unix</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </channel>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <crypto supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>qemu</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendModel'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>builtin</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </crypto>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <interface supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>default</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>passt</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </interface>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <panic supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>isa</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>hyperv</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </panic>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <console supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>null</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vc</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pty</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>dev</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>file</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pipe</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>stdio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>udp</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tcp</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>unix</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>qemu-vdagent</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>dbus</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </console>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </devices>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <features>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <gic supported='no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <vmcoreinfo supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <genid supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <backingStoreInput supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <backup supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <async-teardown supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <ps2 supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <sev supported='no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <sgx supported='no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <hyperv supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='features'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>relaxed</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vapic</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>spinlocks</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vpindex</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>runtime</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>synic</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>stimer</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>reset</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vendor_id</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>frequencies</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>reenlightenment</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tlbflush</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>ipi</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>avic</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>emsr_bitmap</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>xmm_input</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <defaults>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <spinlocks>4095</spinlocks>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <stimer_direct>on</stimer_direct>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <tlbflush_direct>on</tlbflush_direct>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <tlbflush_extended>on</tlbflush_extended>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </defaults>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </hyperv>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <launchSecurity supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='sectype'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tdx</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </launchSecurity>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </features>
Nov 23 16:01:51 np0005532763 nova_compute[230316]: </domainCapabilities>
Nov 23 16:01:51 np0005532763 nova_compute[230316]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.104 230320 DEBUG nova.virt.libvirt.host [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 23 16:01:51 np0005532763 nova_compute[230316]: <domainCapabilities>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <path>/usr/libexec/qemu-kvm</path>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <domain>kvm</domain>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <arch>x86_64</arch>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <vcpu max='240'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <iothreads supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <os supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <enum name='firmware'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <loader supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>rom</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pflash</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='readonly'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>yes</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>no</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='secure'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>no</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </loader>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </os>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <cpu>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <mode name='host-passthrough' supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='hostPassthroughMigratable'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>on</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>off</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </mode>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <mode name='maximum' supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='maximumMigratable'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>on</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>off</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </mode>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <mode name='host-model' supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <vendor>AMD</vendor>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='x2apic'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='tsc-deadline'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='hypervisor'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='tsc_adjust'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='spec-ctrl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='stibp'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='ssbd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='cmp_legacy'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='overflow-recov'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='succor'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='ibrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='amd-ssbd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='virt-ssbd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='lbrv'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='tsc-scale'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='vmcb-clean'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='flushbyasid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='pause-filter'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='pfthreshold'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='svme-addr-chk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <feature policy='disable' name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </mode>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <mode name='custom' supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-noTSX'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Broadwell-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cascadelake-Server-v5'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cooperlake'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cooperlake-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Cooperlake-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Denverton'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mpx'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Denverton-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mpx'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Denverton-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Denverton-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Dhyana-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Genoa'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amd-psfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='auto-ibrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='stibp-always-on'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Genoa-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amd-psfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='auto-ibrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='stibp-always-on'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Milan'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Milan-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Milan-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amd-psfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='stibp-always-on'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Rome'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Rome-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Rome-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-Rome-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='EPYC-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='GraniteRapids'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='prefetchiti'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='GraniteRapids-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='prefetchiti'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='GraniteRapids-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx10'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx10-128'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx10-256'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx10-512'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='prefetchiti'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-noTSX'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Haswell-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-noTSX'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v5'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v6'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Icelake-Server-v7'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='IvyBridge'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='IvyBridge-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='IvyBridge-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='IvyBridge-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='KnightsMill'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-4fmaps'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-4vnniw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512er'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512pf'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='KnightsMill-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-4fmaps'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-4vnniw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512er'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512pf'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Opteron_G4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fma4'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xop'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Opteron_G4-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fma4'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xop'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Opteron_G5'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fma4'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tbm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xop'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Opteron_G5-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fma4'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tbm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xop'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SapphireRapids'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SapphireRapids-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SapphireRapids-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SapphireRapids-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='amx-tile'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-bf16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-fp16'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bitalg'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrc'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fzrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='la57'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='taa-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xfd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SierraForest'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-ne-convert'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cmpccxadd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='SierraForest-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-ifma'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-ne-convert'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx-vnni-int8'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cmpccxadd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fbsdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='fsrs'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ibrs-all'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mcdt-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pbrsb-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='psdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='serialize'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vaes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Client-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='hle'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='rtm'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Skylake-Server-v5'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512bw'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512cd'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512dq'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512f'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='avx512vl'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='invpcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pcid'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='pku'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='core-capability'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mpx'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='split-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='core-capability'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='mpx'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='split-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge-v2'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='core-capability'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='split-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge-v3'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='core-capability'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='split-lock-detect'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='Snowridge-v4'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='cldemote'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='erms'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='gfni'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdir64b'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='movdiri'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='xsaves'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='athlon'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnow'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnowext'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='athlon-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnow'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnowext'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='core2duo'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='core2duo-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='coreduo'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='coreduo-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='n270'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='n270-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='ss'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='phenom'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnow'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnowext'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <blockers model='phenom-v1'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnow'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <feature name='3dnowext'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </blockers>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </mode>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </cpu>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <memoryBacking supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <enum name='sourceType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>file</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>anonymous</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <value>memfd</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </memoryBacking>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <devices>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <disk supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='diskDevice'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>disk</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>cdrom</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>floppy</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>lun</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='bus'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>ide</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>fdc</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>scsi</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>usb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>sata</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio-transitional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio-non-transitional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </disk>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <graphics supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vnc</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>egl-headless</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>dbus</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </graphics>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <video supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='modelType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vga</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>cirrus</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>none</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>bochs</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>ramfb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </video>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <hostdev supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='mode'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>subsystem</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='startupPolicy'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>default</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>mandatory</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>requisite</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>optional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='subsysType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>usb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pci</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>scsi</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='capsType'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='pciBackend'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </hostdev>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <rng supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio-transitional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtio-non-transitional</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendModel'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>random</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>egd</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>builtin</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </rng>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <filesystem supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='driverType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>path</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>handle</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>virtiofs</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </filesystem>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <tpm supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tpm-tis</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tpm-crb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendModel'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>emulator</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>external</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendVersion'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>2.0</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </tpm>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <redirdev supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='bus'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>usb</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </redirdev>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <channel supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pty</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>unix</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </channel>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <crypto supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>qemu</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendModel'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>builtin</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </crypto>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <interface supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='backendType'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>default</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>passt</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </interface>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <panic supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='model'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>isa</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>hyperv</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </panic>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <console supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='type'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>null</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vc</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pty</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>dev</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>file</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>pipe</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>stdio</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>udp</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tcp</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>unix</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>qemu-vdagent</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>dbus</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </console>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </devices>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  <features>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <gic supported='no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <vmcoreinfo supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <genid supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <backingStoreInput supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <backup supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <async-teardown supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <ps2 supported='yes'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <sev supported='no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <sgx supported='no'/>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <hyperv supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='features'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>relaxed</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vapic</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>spinlocks</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vpindex</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>runtime</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>synic</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>stimer</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>reset</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>vendor_id</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>frequencies</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>reenlightenment</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tlbflush</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>ipi</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>avic</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>emsr_bitmap</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>xmm_input</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <defaults>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <spinlocks>4095</spinlocks>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <stimer_direct>on</stimer_direct>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <tlbflush_direct>on</tlbflush_direct>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <tlbflush_extended>on</tlbflush_extended>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </defaults>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </hyperv>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    <launchSecurity supported='yes'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      <enum name='sectype'>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:        <value>tdx</value>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:      </enum>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:    </launchSecurity>
Nov 23 16:01:51 np0005532763 nova_compute[230316]:  </features>
Nov 23 16:01:51 np0005532763 nova_compute[230316]: </domainCapabilities>
Nov 23 16:01:51 np0005532763 nova_compute[230316]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.174 230320 DEBUG nova.virt.libvirt.host [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.175 230320 INFO nova.virt.libvirt.host [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Secure Boot support detected#033[00m
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.177 230320 INFO nova.virt.libvirt.driver [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.177 230320 INFO nova.virt.libvirt.driver [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.187 230320 DEBUG nova.virt.libvirt.driver [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.203 230320 INFO nova.virt.node [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Determined node identity 20c32e0a-de2c-427c-9273-fac11e2660f4 from /var/lib/nova/compute_id#033[00m
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.213 230320 WARNING nova.compute.manager [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Compute nodes ['20c32e0a-de2c-427c-9273-fac11e2660f4'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.229 230320 INFO nova.compute.manager [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.250 230320 WARNING nova.compute.manager [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.251 230320 DEBUG oslo_concurrency.lockutils [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.251 230320 DEBUG oslo_concurrency.lockutils [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.251 230320 DEBUG oslo_concurrency.lockutils [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.251 230320 DEBUG nova.compute.resource_tracker [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.251 230320 DEBUG oslo_concurrency.processutils [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:01:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:51.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:51 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:01:51 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3600869024' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:01:51 np0005532763 nova_compute[230316]: 2025-11-23 21:01:51.742 230320 DEBUG oslo_concurrency.processutils [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:01:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:51 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2dc0021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:51 np0005532763 systemd[1]: Starting libvirt nodedev daemon...
Nov 23 16:01:51 np0005532763 systemd[1]: Started libvirt nodedev daemon.
Nov 23 16:01:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:52 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2b40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:52 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:52.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:52 np0005532763 nova_compute[230316]: 2025-11-23 21:01:52.137 230320 WARNING nova.virt.libvirt.driver [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:01:52 np0005532763 nova_compute[230316]: 2025-11-23 21:01:52.139 230320 DEBUG nova.compute.resource_tracker [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5246MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:01:52 np0005532763 nova_compute[230316]: 2025-11-23 21:01:52.139 230320 DEBUG oslo_concurrency.lockutils [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:01:52 np0005532763 nova_compute[230316]: 2025-11-23 21:01:52.140 230320 DEBUG oslo_concurrency.lockutils [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:01:52 np0005532763 nova_compute[230316]: 2025-11-23 21:01:52.153 230320 WARNING nova.compute.resource_tracker [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] No compute node record for compute-2.ctlplane.example.com:20c32e0a-de2c-427c-9273-fac11e2660f4: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 20c32e0a-de2c-427c-9273-fac11e2660f4 could not be found.#033[00m
Nov 23 16:01:52 np0005532763 nova_compute[230316]: 2025-11-23 21:01:52.178 230320 INFO nova.compute.resource_tracker [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: 20c32e0a-de2c-427c-9273-fac11e2660f4#033[00m
Nov 23 16:01:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:01:52.214 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:01:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:01:52.215 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:01:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:01:52.215 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:01:52 np0005532763 nova_compute[230316]: 2025-11-23 21:01:52.220 230320 DEBUG nova.compute.resource_tracker [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:01:52 np0005532763 nova_compute[230316]: 2025-11-23 21:01:52.221 230320 DEBUG nova.compute.resource_tracker [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:01:52 np0005532763 python3.9[231248]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 23 16:01:52 np0005532763 systemd[1]: Stopping nova_compute container...
Nov 23 16:01:52 np0005532763 nova_compute[230316]: 2025-11-23 21:01:52.543 230320 DEBUG oslo_concurrency.lockutils [None req-da70f04c-3a94-4547-8ee1-efe17a491276 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.402s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:01:52 np0005532763 nova_compute[230316]: 2025-11-23 21:01:52.543 230320 DEBUG oslo_concurrency.lockutils [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:01:52 np0005532763 nova_compute[230316]: 2025-11-23 21:01:52.543 230320 DEBUG oslo_concurrency.lockutils [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:01:52 np0005532763 nova_compute[230316]: 2025-11-23 21:01:52.544 230320 DEBUG oslo_concurrency.lockutils [None req-e9079c1b-ea21-42ac-ab05-51561f7c0947 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:01:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:52 np0005532763 virtqemud[230850]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 23 16:01:52 np0005532763 virtqemud[230850]: hostname: compute-2
Nov 23 16:01:52 np0005532763 virtqemud[230850]: End of file while reading data: Input/output error
Nov 23 16:01:52 np0005532763 systemd[1]: libpod-f3c0cc402a22f5be8671accde6980d0d76ac2bdc1ca3540b965dffc485d1be66.scope: Deactivated successfully.
Nov 23 16:01:52 np0005532763 systemd[1]: libpod-f3c0cc402a22f5be8671accde6980d0d76ac2bdc1ca3540b965dffc485d1be66.scope: Consumed 3.569s CPU time.
Nov 23 16:01:52 np0005532763 podman[231254]: 2025-11-23 21:01:52.933771801 +0000 UTC m=+0.453973158 container died f3c0cc402a22f5be8671accde6980d0d76ac2bdc1ca3540b965dffc485d1be66 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:01:52 np0005532763 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f3c0cc402a22f5be8671accde6980d0d76ac2bdc1ca3540b965dffc485d1be66-userdata-shm.mount: Deactivated successfully.
Nov 23 16:01:52 np0005532763 systemd[1]: var-lib-containers-storage-overlay-ff1899460981cfa6c12cb6f79592d054e2fa693cee5abef6995622ed1beea179-merged.mount: Deactivated successfully.
Nov 23 16:01:53 np0005532763 podman[231254]: 2025-11-23 21:01:53.021694758 +0000 UTC m=+0.541896155 container cleanup f3c0cc402a22f5be8671accde6980d0d76ac2bdc1ca3540b965dffc485d1be66 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 16:01:53 np0005532763 podman[231254]: nova_compute
Nov 23 16:01:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:53 np0005532763 podman[231284]: nova_compute
Nov 23 16:01:53 np0005532763 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 23 16:01:53 np0005532763 systemd[1]: Stopped nova_compute container.
Nov 23 16:01:53 np0005532763 systemd[1]: Starting nova_compute container...
Nov 23 16:01:53 np0005532763 systemd[1]: Started libcrun container.
Nov 23 16:01:53 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff1899460981cfa6c12cb6f79592d054e2fa693cee5abef6995622ed1beea179/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 23 16:01:53 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff1899460981cfa6c12cb6f79592d054e2fa693cee5abef6995622ed1beea179/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 23 16:01:53 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff1899460981cfa6c12cb6f79592d054e2fa693cee5abef6995622ed1beea179/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 23 16:01:53 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff1899460981cfa6c12cb6f79592d054e2fa693cee5abef6995622ed1beea179/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 23 16:01:53 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff1899460981cfa6c12cb6f79592d054e2fa693cee5abef6995622ed1beea179/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 23 16:01:53 np0005532763 podman[231296]: 2025-11-23 21:01:53.29614713 +0000 UTC m=+0.130529354 container init f3c0cc402a22f5be8671accde6980d0d76ac2bdc1ca3540b965dffc485d1be66 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 16:01:53 np0005532763 podman[231296]: 2025-11-23 21:01:53.308134296 +0000 UTC m=+0.142516470 container start f3c0cc402a22f5be8671accde6980d0d76ac2bdc1ca3540b965dffc485d1be66 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 16:01:53 np0005532763 podman[231296]: nova_compute
Nov 23 16:01:53 np0005532763 nova_compute[231311]: + sudo -E kolla_set_configs
Nov 23 16:01:53 np0005532763 systemd[1]: Started nova_compute container.
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Validating config file
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Copying service configuration files
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Deleting /etc/ceph
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Creating directory /etc/ceph
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Setting permission for /etc/ceph
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Writing out command to execute
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 23 16:01:53 np0005532763 nova_compute[231311]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 23 16:01:53 np0005532763 nova_compute[231311]: ++ cat /run_command
Nov 23 16:01:53 np0005532763 nova_compute[231311]: + CMD=nova-compute
Nov 23 16:01:53 np0005532763 nova_compute[231311]: + ARGS=
Nov 23 16:01:53 np0005532763 nova_compute[231311]: + sudo kolla_copy_cacerts
Nov 23 16:01:53 np0005532763 nova_compute[231311]: + [[ ! -n '' ]]
Nov 23 16:01:53 np0005532763 nova_compute[231311]: + . kolla_extend_start
Nov 23 16:01:53 np0005532763 nova_compute[231311]: Running command: 'nova-compute'
Nov 23 16:01:53 np0005532763 nova_compute[231311]: + echo 'Running command: '\''nova-compute'\'''
Nov 23 16:01:53 np0005532763 nova_compute[231311]: + umask 0022
Nov 23 16:01:53 np0005532763 nova_compute[231311]: + exec nova-compute
Nov 23 16:01:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:01:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:53.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:01:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:53 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:53 : epoch 6923760c : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:01:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:53 : epoch 6923760c : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:01:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:54 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2dc0021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:54 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2b40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:54.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:01:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.230 231315 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.231 231315 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.231 231315 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.231 231315 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.347 231315 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.374 231315 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.374 231315 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 23 16:01:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:01:55.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:55 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2bc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.784 231315 INFO nova.virt.driver [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 23 16:01:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:01:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.898 231315 INFO nova.compute.provider_config [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.905 231315 DEBUG oslo_concurrency.lockutils [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.905 231315 DEBUG oslo_concurrency.lockutils [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.906 231315 DEBUG oslo_concurrency.lockutils [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.906 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.906 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.906 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.906 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.906 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.907 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.907 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.907 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.907 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.907 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.907 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.907 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.908 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.908 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.908 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.908 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.908 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.908 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.908 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.909 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.909 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.909 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.909 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.909 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.909 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.909 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.909 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.910 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.910 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.910 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.910 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.910 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.910 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.910 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.911 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.911 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.911 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.911 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.911 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.911 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.911 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.912 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.912 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.912 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.912 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.912 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.913 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.913 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.913 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.913 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.913 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.913 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.913 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.914 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.914 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.914 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.914 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.914 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.914 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.914 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.915 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.915 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.915 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.915 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.915 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.915 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.915 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.915 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.916 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.916 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.917 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.917 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.917 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.917 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.917 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.917 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.917 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.918 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.918 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.918 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.918 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.918 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.918 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.918 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.919 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.919 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.919 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.919 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.919 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.919 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.919 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.920 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.920 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.920 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.920 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.920 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.920 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.920 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.921 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.921 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.921 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.921 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.921 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.921 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.921 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.922 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.922 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.922 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.922 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.922 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.922 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.922 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.922 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.923 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.923 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.923 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.923 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.923 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.923 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.923 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.924 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.924 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.924 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.924 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.924 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.924 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.924 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.925 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.925 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.925 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.925 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.925 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.925 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.925 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.926 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.926 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.926 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.926 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.926 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.926 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.926 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.926 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.927 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.927 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.927 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.927 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.927 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.927 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.927 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.928 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.928 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.928 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.928 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.928 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.928 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.929 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.929 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.929 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.929 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.929 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.929 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.929 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.930 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.930 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.930 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.930 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.930 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.930 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.930 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.931 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.931 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.931 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.931 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.931 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.931 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.931 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.932 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.932 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.932 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.932 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.932 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.932 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.932 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.933 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.933 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.933 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.933 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.933 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.933 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.933 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.934 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.934 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.934 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.934 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.934 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.934 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.934 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.934 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.935 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.935 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.935 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.935 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.935 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.935 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.936 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.936 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.936 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.936 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.936 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.936 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.936 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.937 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.937 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.937 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.937 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.937 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.937 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.937 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.938 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.938 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.938 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.938 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.938 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.938 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.938 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.939 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.939 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.939 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.939 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.939 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.939 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.939 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.940 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.940 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.940 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.940 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.940 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.940 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.940 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.941 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.941 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.941 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.941 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.941 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.941 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.941 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.942 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.942 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.942 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.942 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.942 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.942 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.942 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.943 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.943 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.943 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.943 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.943 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.943 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.943 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.944 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.944 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.944 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.944 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.945 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.945 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.945 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.945 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.945 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.945 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.945 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.946 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.946 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.946 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.946 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.946 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.946 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.946 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.947 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.947 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.947 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.947 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.947 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.947 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.947 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.948 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.948 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.948 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.948 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.948 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.948 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.948 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.949 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.949 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.949 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.949 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.949 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.949 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.949 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.950 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.950 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.950 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.950 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.950 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.950 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.951 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.951 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.951 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.951 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.951 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.951 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.951 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.952 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.952 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.952 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.952 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.952 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.952 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.952 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.953 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.953 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.953 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.953 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.953 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.953 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.953 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.953 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.954 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.954 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.954 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.954 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.954 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.954 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.954 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.955 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.955 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.955 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.955 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.955 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.955 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.955 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.956 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.956 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.956 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.956 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.956 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.956 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.956 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.957 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.957 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.957 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.957 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.957 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.957 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.957 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.958 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.958 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.958 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.958 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.958 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.958 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.959 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.959 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.959 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.959 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.959 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.959 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.959 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.960 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.960 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.960 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.960 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.960 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.960 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.960 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.961 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.961 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.961 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.961 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.961 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.961 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.961 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.961 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.962 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.962 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.962 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.962 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.962 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.962 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.962 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.963 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.963 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.963 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.963 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.963 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.963 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.963 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.964 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.964 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.964 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.964 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.964 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.964 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.964 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.965 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.965 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.965 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.965 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.965 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.965 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.965 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.966 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.966 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.966 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.966 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.966 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.966 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.966 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.966 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.967 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.967 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.967 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.967 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.967 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.967 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.967 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.968 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.968 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.968 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.968 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.968 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.968 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.968 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.969 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.969 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.969 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.969 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.969 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.969 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.969 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.970 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.970 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.970 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.970 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.970 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.970 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.970 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.970 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.971 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.971 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.971 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.971 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.971 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.971 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.972 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.972 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.972 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.972 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.972 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.972 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.973 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.973 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.973 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.973 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.973 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.973 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.973 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.974 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.974 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.974 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.974 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.974 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.974 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.974 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.975 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.975 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.975 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.975 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.975 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.975 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.975 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.976 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.976 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.976 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.976 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.976 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.976 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.976 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.977 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.977 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.977 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.977 231315 WARNING oslo_config.cfg [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 23 16:01:55 np0005532763 nova_compute[231311]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 23 16:01:55 np0005532763 nova_compute[231311]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 23 16:01:55 np0005532763 nova_compute[231311]: and ``live_migration_inbound_addr`` respectively.
Nov 23 16:01:55 np0005532763 nova_compute[231311]: ).  Its value may be silently ignored in the future.#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.978 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.978 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.978 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.978 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.978 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.978 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.979 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.979 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.979 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.979 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.979 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.979 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.979 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.980 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.980 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.980 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.980 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.980 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.980 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.rbd_secret_uuid        = 03808be8-ae4a-5548-82e6-4a294f1bc627 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.980 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.981 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.981 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.981 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.981 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.981 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.981 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.981 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.982 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.982 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.982 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.982 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.982 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.982 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.983 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.983 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.983 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.983 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.983 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.983 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.984 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.985 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.985 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.985 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.985 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.985 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.985 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.985 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.986 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.986 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.986 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.986 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.986 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.986 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.987 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.987 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.987 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.987 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.987 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.987 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.987 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.987 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.988 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.988 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.988 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.988 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.988 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.988 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.988 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.989 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.989 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.989 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.989 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.989 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.989 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.989 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.989 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.990 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.990 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.990 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.990 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.990 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.990 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.990 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.991 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.991 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.991 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.991 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.991 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.991 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.991 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.991 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.992 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.992 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.992 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.992 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.992 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.992 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.992 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.993 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.993 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.993 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.993 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.993 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.993 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.993 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.993 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.994 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.994 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.994 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.994 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.994 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.994 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.994 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.994 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.995 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.995 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.995 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.995 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.995 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.995 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.995 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.996 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.996 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.996 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.996 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.996 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.996 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.996 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.997 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.997 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.997 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.997 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.997 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.997 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.997 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.997 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.998 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.998 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.998 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.998 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.998 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.998 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.999 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.999 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.999 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.999 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.999 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.999 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:55 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.999 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:55.999 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.000 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.000 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.000 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.000 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.000 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.000 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.000 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.001 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.001 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.001 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.001 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.001 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.001 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.001 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.002 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.002 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.002 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.002 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.002 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.002 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.002 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.002 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.003 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.003 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.003 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.003 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.003 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.003 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.003 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.004 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.004 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.004 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.004 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.004 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.004 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.004 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.005 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.005 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.005 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.005 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.005 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.005 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.005 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.005 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.006 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.006 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.006 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.006 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.006 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.006 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.007 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.007 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.007 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.007 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.007 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.007 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.007 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.007 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.008 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.008 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.008 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.008 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.008 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.008 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.008 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.009 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.009 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.009 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.009 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.009 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.009 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.009 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.009 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.010 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.010 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.010 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.010 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.010 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.010 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.010 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.010 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.011 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.011 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.011 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.011 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.011 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.012 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.012 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:56 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2c8004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.012 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.012 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.012 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.012 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.012 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.013 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.013 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.013 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.013 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.013 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.013 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.013 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.014 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.014 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.014 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.014 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.014 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.014 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.014 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.015 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.015 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.015 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.015 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.015 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.015 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.015 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.016 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.016 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.016 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.016 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.016 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.016 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.016 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.016 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.017 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.017 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.017 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.017 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.017 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.017 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.017 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.017 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.018 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.018 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.018 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.018 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.018 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.018 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.018 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.019 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.019 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.019 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[226372]: 23/11/2025 21:01:56 : epoch 6923760c : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fe2dc0021d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.019 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.019 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.019 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.020 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.020 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.020 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.020 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.020 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.020 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.020 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.021 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.021 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.021 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.021 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.021 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.021 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.021 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.022 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.022 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.022 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.022 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.022 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.022 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.022 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.022 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.023 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.023 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.023 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.023 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.023 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.023 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.023 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.024 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.024 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.024 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.024 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.024 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.024 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.024 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.024 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.025 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.025 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.025 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.025 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.025 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.025 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.025 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.026 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.026 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.026 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.026 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.026 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.026 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.026 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.027 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.027 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.027 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.027 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.027 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.027 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.027 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.027 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.028 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.028 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.028 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.028 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.028 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.028 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.028 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.028 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.029 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.029 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.029 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.029 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.029 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.029 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.029 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.030 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.030 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.030 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.030 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.030 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.030 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.030 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.030 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.031 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.031 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.031 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.031 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.031 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.031 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.031 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.032 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.032 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.032 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.032 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.032 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.032 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.032 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.032 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.033 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.033 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.033 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.033 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.033 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.033 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.033 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.034 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.034 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.034 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.034 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.034 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.034 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:01:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.034 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.034 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.035 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.035 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.035 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.035 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.035 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.035 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.035 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.036 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.036 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.036 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.036 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.036 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.036 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.036 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.037 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.037 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.037 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.037 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.037 231315 DEBUG oslo_service.service [None req-707e598f-813e-478d-811f-b1c6aefeb6cf - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.038 231315 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.047 231315 INFO nova.virt.node [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Determined node identity 20c32e0a-de2c-427c-9273-fac11e2660f4 from /var/lib/nova/compute_id#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.048 231315 DEBUG nova.virt.libvirt.host [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.048 231315 DEBUG nova.virt.libvirt.host [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.049 231315 DEBUG nova.virt.libvirt.host [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.049 231315 DEBUG nova.virt.libvirt.host [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.062 231315 DEBUG nova.virt.libvirt.host [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7ff8e5a15250> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.066 231315 DEBUG nova.virt.libvirt.host [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7ff8e5a15250> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.067 231315 INFO nova.virt.libvirt.driver [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.075 231315 INFO nova.virt.libvirt.host [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Libvirt host capabilities <capabilities>
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <host>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <uuid>e38e4d8b-cfb8-4d24-8752-3d68cd15bb48</uuid>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <cpu>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <arch>x86_64</arch>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model>EPYC-Rome-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <vendor>AMD</vendor>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <microcode version='16777317'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <signature family='23' model='49' stepping='0'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='x2apic'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='tsc-deadline'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='osxsave'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='hypervisor'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='tsc_adjust'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='spec-ctrl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='stibp'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='arch-capabilities'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='ssbd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='cmp_legacy'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='topoext'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='virt-ssbd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='lbrv'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='tsc-scale'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='vmcb-clean'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='pause-filter'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='pfthreshold'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='svme-addr-chk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='rdctl-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='skip-l1dfl-vmentry'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='mds-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature name='pschange-mc-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <pages unit='KiB' size='4'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <pages unit='KiB' size='2048'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <pages unit='KiB' size='1048576'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </cpu>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <power_management>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <suspend_mem/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </power_management>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <iommu support='no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <migration_features>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <live/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <uri_transports>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <uri_transport>tcp</uri_transport>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <uri_transport>rdma</uri_transport>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </uri_transports>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </migration_features>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <topology>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <cells num='1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <cell id='0'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:          <memory unit='KiB'>7864320</memory>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:          <pages unit='KiB' size='4'>1966080</pages>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:          <pages unit='KiB' size='2048'>0</pages>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:          <distances>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:            <sibling id='0' value='10'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:          </distances>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:          <cpus num='8'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:          </cpus>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        </cell>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </cells>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </topology>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <cache>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </cache>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <secmodel>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model>selinux</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <doi>0</doi>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </secmodel>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <secmodel>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model>dac</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <doi>0</doi>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </secmodel>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  </host>
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <guest>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <os_type>hvm</os_type>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <arch name='i686'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <wordsize>32</wordsize>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <domain type='qemu'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <domain type='kvm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </arch>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <features>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <pae/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <nonpae/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <acpi default='on' toggle='yes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <apic default='on' toggle='no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <cpuselection/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <deviceboot/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <disksnapshot default='on' toggle='no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <externalSnapshot/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </features>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  </guest>
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <guest>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <os_type>hvm</os_type>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <arch name='x86_64'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <wordsize>64</wordsize>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <domain type='qemu'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <domain type='kvm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </arch>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <features>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <acpi default='on' toggle='yes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <apic default='on' toggle='no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <cpuselection/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <deviceboot/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <disksnapshot default='on' toggle='no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <externalSnapshot/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </features>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  </guest>
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 
Nov 23 16:01:56 np0005532763 nova_compute[231311]: </capabilities>
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.079 231315 DEBUG nova.virt.libvirt.volume.mount [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.087 231315 DEBUG nova.virt.libvirt.host [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.093 231315 DEBUG nova.virt.libvirt.host [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 23 16:01:56 np0005532763 nova_compute[231311]: <domainCapabilities>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <path>/usr/libexec/qemu-kvm</path>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <domain>kvm</domain>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <arch>i686</arch>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <vcpu max='4096'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <iothreads supported='yes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <os supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <enum name='firmware'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <loader supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='type'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>rom</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>pflash</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='readonly'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>yes</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>no</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='secure'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>no</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </loader>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  </os>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <cpu>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <mode name='host-passthrough' supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='hostPassthroughMigratable'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>on</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>off</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </mode>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <mode name='maximum' supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='maximumMigratable'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>on</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>off</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </mode>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <mode name='host-model' supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <vendor>AMD</vendor>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='x2apic'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='tsc-deadline'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='hypervisor'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='tsc_adjust'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='spec-ctrl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='stibp'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='ssbd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='cmp_legacy'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='overflow-recov'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='succor'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='ibrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='amd-ssbd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='virt-ssbd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='lbrv'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='tsc-scale'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='vmcb-clean'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='flushbyasid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='pause-filter'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='pfthreshold'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='svme-addr-chk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='disable' name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </mode>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <mode name='custom' supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-noTSX'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-v5'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cooperlake'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cooperlake-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cooperlake-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Denverton'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mpx'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Denverton-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mpx'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Denverton-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Denverton-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Dhyana-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Genoa'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amd-psfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='auto-ibrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='stibp-always-on'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Genoa-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amd-psfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='auto-ibrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='stibp-always-on'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Milan'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Milan-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Milan-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amd-psfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='stibp-always-on'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Rome'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Rome-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Rome-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Rome-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='GraniteRapids'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-tile'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fbsdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrc'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fzrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mcdt-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pbrsb-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='prefetchiti'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='psdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='GraniteRapids-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-tile'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fbsdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrc'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fzrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mcdt-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pbrsb-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='prefetchiti'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='psdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='GraniteRapids-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-tile'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx10'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx10-128'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx10-256'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx10-512'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cldemote'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fbsdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrc'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fzrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mcdt-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdir64b'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdiri'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pbrsb-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='prefetchiti'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='psdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Haswell'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Haswell-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Haswell-noTSX'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Haswell-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Haswell-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Haswell-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Haswell-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server-noTSX'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server-v5'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server-v6'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server-v7'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='IvyBridge'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='IvyBridge-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='IvyBridge-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='IvyBridge-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='KnightsMill'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-4fmaps'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-4vnniw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512er'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512pf'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='KnightsMill-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-4fmaps'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-4vnniw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512er'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512pf'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Opteron_G4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fma4'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xop'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Opteron_G4-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fma4'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xop'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Opteron_G5'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fma4'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tbm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xop'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Opteron_G5-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fma4'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tbm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xop'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='SapphireRapids'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-tile'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrc'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fzrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='SapphireRapids-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-tile'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrc'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fzrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='SapphireRapids-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-tile'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fbsdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrc'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fzrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='psdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='SapphireRapids-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-tile'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cldemote'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fbsdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrc'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fzrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdir64b'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdiri'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='psdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='SierraForest'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-ifma'/>
Nov 23 16:01:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-ne-convert'/>
Nov 23 16:01:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:01:56.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cmpccxadd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fbsdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mcdt-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pbrsb-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='psdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='SierraForest-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-ne-convert'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cmpccxadd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fbsdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mcdt-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pbrsb-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='psdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Client'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Client-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Client-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Client-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Client-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Client-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Server'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Server-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Server-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Server-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Server-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Server-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Server-v5'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Snowridge'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cldemote'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='core-capability'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdir64b'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdiri'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mpx'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='split-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Snowridge-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cldemote'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='core-capability'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdir64b'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdiri'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mpx'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='split-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Snowridge-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cldemote'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='core-capability'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdir64b'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdiri'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='split-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Snowridge-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cldemote'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='core-capability'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdir64b'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdiri'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='split-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Snowridge-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cldemote'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdir64b'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdiri'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='athlon'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='3dnow'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='3dnowext'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='athlon-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='3dnow'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='3dnowext'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='core2duo'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='core2duo-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='coreduo'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='coreduo-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='n270'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='n270-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='phenom'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='3dnow'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='3dnowext'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='phenom-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='3dnow'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='3dnowext'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </mode>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  </cpu>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <memoryBacking supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <enum name='sourceType'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <value>file</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <value>anonymous</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <value>memfd</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  </memoryBacking>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <devices>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <disk supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='diskDevice'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>disk</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>cdrom</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>floppy</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>lun</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='bus'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>fdc</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>scsi</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtio</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>usb</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>sata</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='model'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtio</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtio-transitional</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtio-non-transitional</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </disk>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <graphics supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='type'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>vnc</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>egl-headless</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>dbus</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </graphics>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <video supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='modelType'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>vga</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>cirrus</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtio</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>none</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>bochs</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>ramfb</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </video>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <hostdev supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='mode'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>subsystem</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='startupPolicy'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>default</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>mandatory</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>requisite</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>optional</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='subsysType'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>usb</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>pci</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>scsi</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='capsType'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='pciBackend'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </hostdev>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <rng supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='model'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtio</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtio-transitional</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtio-non-transitional</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='backendModel'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>random</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>egd</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>builtin</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </rng>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <filesystem supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='driverType'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>path</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>handle</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtiofs</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </filesystem>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <tpm supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='model'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>tpm-tis</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>tpm-crb</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='backendModel'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>emulator</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>external</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='backendVersion'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>2.0</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </tpm>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <redirdev supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='bus'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>usb</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </redirdev>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <channel supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='type'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>pty</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>unix</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </channel>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <crypto supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='model'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='type'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>qemu</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='backendModel'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>builtin</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </crypto>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <interface supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='backendType'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>default</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>passt</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </interface>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <panic supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='model'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>isa</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>hyperv</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </panic>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <console supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='type'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>null</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>vc</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>pty</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>dev</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>file</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>pipe</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>stdio</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>udp</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>tcp</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>unix</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>qemu-vdagent</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>dbus</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </console>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  </devices>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <features>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <gic supported='no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <vmcoreinfo supported='yes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <genid supported='yes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <backingStoreInput supported='yes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <backup supported='yes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <async-teardown supported='yes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <ps2 supported='yes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <sev supported='no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <sgx supported='no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <hyperv supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='features'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>relaxed</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>vapic</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>spinlocks</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>vpindex</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>runtime</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>synic</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>stimer</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>reset</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>vendor_id</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>frequencies</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>reenlightenment</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>tlbflush</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>ipi</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>avic</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>emsr_bitmap</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>xmm_input</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <defaults>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <spinlocks>4095</spinlocks>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <stimer_direct>on</stimer_direct>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <tlbflush_direct>on</tlbflush_direct>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <tlbflush_extended>on</tlbflush_extended>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </defaults>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </hyperv>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <launchSecurity supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='sectype'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>tdx</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </launchSecurity>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  </features>
Nov 23 16:01:56 np0005532763 nova_compute[231311]: </domainCapabilities>
Nov 23 16:01:56 np0005532763 nova_compute[231311]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.102 231315 DEBUG nova.virt.libvirt.host [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 23 16:01:56 np0005532763 nova_compute[231311]: <domainCapabilities>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <path>/usr/libexec/qemu-kvm</path>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <domain>kvm</domain>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <arch>i686</arch>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <vcpu max='240'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <iothreads supported='yes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <os supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <enum name='firmware'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <loader supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='type'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>rom</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>pflash</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='readonly'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>yes</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>no</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='secure'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>no</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </loader>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  </os>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <cpu>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <mode name='host-passthrough' supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='hostPassthroughMigratable'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>on</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>off</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </mode>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <mode name='maximum' supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='maximumMigratable'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>on</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>off</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </mode>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <mode name='host-model' supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <vendor>AMD</vendor>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='x2apic'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='tsc-deadline'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='hypervisor'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='tsc_adjust'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='spec-ctrl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='stibp'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='ssbd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='cmp_legacy'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='overflow-recov'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='succor'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='ibrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='amd-ssbd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='virt-ssbd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='lbrv'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='tsc-scale'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='vmcb-clean'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='flushbyasid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='pause-filter'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='pfthreshold'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='svme-addr-chk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='disable' name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </mode>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <mode name='custom' supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-noTSX'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-v5'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cooperlake'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cooperlake-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cooperlake-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Denverton'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mpx'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Denverton-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mpx'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Denverton-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Denverton-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Dhyana-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Genoa'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amd-psfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='auto-ibrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='stibp-always-on'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Genoa-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amd-psfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='auto-ibrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='stibp-always-on'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Milan'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Milan-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Milan-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amd-psfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='no-nested-data-bp'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='null-sel-clr-base'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='stibp-always-on'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Rome'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Rome-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Rome-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-Rome-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='EPYC-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='GraniteRapids'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-tile'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fbsdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrc'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fzrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mcdt-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pbrsb-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='prefetchiti'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='psdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='GraniteRapids-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-tile'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fbsdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrc'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fzrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mcdt-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pbrsb-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='prefetchiti'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='psdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='GraniteRapids-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-tile'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx10'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx10-128'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx10-256'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx10-512'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cldemote'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fbsdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrc'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fzrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mcdt-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdir64b'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdiri'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pbrsb-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='prefetchiti'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='psdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Haswell'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Haswell-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Haswell-noTSX'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Haswell-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Haswell-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Haswell-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Haswell-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server-noTSX'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server-v5'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server-v6'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Icelake-Server-v7'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='IvyBridge'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='IvyBridge-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='IvyBridge-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='IvyBridge-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='KnightsMill'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-4fmaps'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-4vnniw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512er'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512pf'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='KnightsMill-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-4fmaps'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-4vnniw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512er'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512pf'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Opteron_G4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fma4'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xop'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Opteron_G4-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fma4'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xop'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Opteron_G5'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fma4'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tbm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xop'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Opteron_G5-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fma4'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tbm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xop'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='SapphireRapids'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-tile'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrc'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fzrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='SapphireRapids-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-tile'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrc'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fzrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='SapphireRapids-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-tile'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fbsdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrc'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fzrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='psdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='SapphireRapids-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='amx-tile'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-fp16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-vpopcntdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bitalg'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vbmi2'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cldemote'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fbsdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrc'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fzrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='la57'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdir64b'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdiri'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='psdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='tsx-ldtrk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xfd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='SierraForest'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-ne-convert'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cmpccxadd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fbsdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mcdt-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pbrsb-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='psdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='SierraForest-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-ifma'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-ne-convert'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx-vnni-int8'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='bus-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cmpccxadd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fbsdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='fsrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mcdt-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pbrsb-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='psdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='sbdr-ssdp-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='serialize'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vaes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='vpclmulqdq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Client'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Client-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Client-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Client-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Client-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Client-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Server'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Server-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Server-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Server-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Server-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Server-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Skylake-Server-v5'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Snowridge'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cldemote'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='core-capability'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdir64b'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdiri'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mpx'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='split-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Snowridge-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cldemote'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='core-capability'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdir64b'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdiri'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mpx'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='split-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Snowridge-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cldemote'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='core-capability'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdir64b'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdiri'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='split-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Snowridge-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cldemote'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='core-capability'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdir64b'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdiri'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='split-lock-detect'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Snowridge-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='cldemote'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='gfni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdir64b'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='movdiri'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='athlon'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='3dnow'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='3dnowext'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='athlon-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='3dnow'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='3dnowext'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='core2duo'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='core2duo-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='coreduo'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='coreduo-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='n270'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='n270-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ss'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='phenom'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='3dnow'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='3dnowext'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='phenom-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='3dnow'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='3dnowext'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </mode>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  </cpu>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <memoryBacking supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <enum name='sourceType'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <value>file</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <value>anonymous</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <value>memfd</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  </memoryBacking>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <devices>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <disk supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='diskDevice'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>disk</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>cdrom</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>floppy</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>lun</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='bus'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>ide</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>fdc</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>scsi</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtio</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>usb</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>sata</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='model'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtio</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtio-transitional</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtio-non-transitional</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </disk>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <graphics supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='type'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>vnc</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>egl-headless</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>dbus</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </graphics>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <video supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='modelType'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>vga</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>cirrus</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtio</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>none</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>bochs</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>ramfb</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </video>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <hostdev supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='mode'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>subsystem</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='startupPolicy'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>default</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>mandatory</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>requisite</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>optional</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='subsysType'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>usb</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>pci</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>scsi</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='capsType'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='pciBackend'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </hostdev>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <rng supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='model'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtio</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtio-transitional</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtio-non-transitional</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='backendModel'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>random</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>egd</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>builtin</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </rng>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <filesystem supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='driverType'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>path</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>handle</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>virtiofs</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </filesystem>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <tpm supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='model'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>tpm-tis</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>tpm-crb</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='backendModel'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>emulator</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>external</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='backendVersion'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>2.0</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </tpm>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <redirdev supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='bus'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>usb</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </redirdev>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <channel supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='type'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>pty</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>unix</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </channel>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <crypto supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='model'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='type'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>qemu</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='backendModel'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>builtin</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </crypto>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <interface supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='backendType'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>default</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>passt</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </interface>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <panic supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='model'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>isa</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>hyperv</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </panic>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <console supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='type'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>null</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>vc</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>pty</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>dev</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>file</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>pipe</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>stdio</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>udp</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>tcp</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>unix</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>qemu-vdagent</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>dbus</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </console>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  </devices>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <features>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <gic supported='no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <vmcoreinfo supported='yes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <genid supported='yes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <backingStoreInput supported='yes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <backup supported='yes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <async-teardown supported='yes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <ps2 supported='yes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <sev supported='no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <sgx supported='no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <hyperv supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='features'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>relaxed</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>vapic</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>spinlocks</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>vpindex</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>runtime</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>synic</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>stimer</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>reset</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>vendor_id</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>frequencies</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>reenlightenment</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>tlbflush</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>ipi</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>avic</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>emsr_bitmap</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>xmm_input</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <defaults>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <spinlocks>4095</spinlocks>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <stimer_direct>on</stimer_direct>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <tlbflush_direct>on</tlbflush_direct>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <tlbflush_extended>on</tlbflush_extended>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </defaults>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </hyperv>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <launchSecurity supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='sectype'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>tdx</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </launchSecurity>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  </features>
Nov 23 16:01:56 np0005532763 nova_compute[231311]: </domainCapabilities>
Nov 23 16:01:56 np0005532763 nova_compute[231311]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.166 231315 DEBUG nova.virt.libvirt.host [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 23 16:01:56 np0005532763 nova_compute[231311]: 2025-11-23 21:01:56.173 231315 DEBUG nova.virt.libvirt.host [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 23 16:01:56 np0005532763 nova_compute[231311]: <domainCapabilities>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <path>/usr/libexec/qemu-kvm</path>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <domain>kvm</domain>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <arch>x86_64</arch>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <vcpu max='4096'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <iothreads supported='yes'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <os supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <enum name='firmware'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <value>efi</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <loader supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='type'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>rom</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>pflash</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='readonly'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>yes</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>no</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='secure'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>yes</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>no</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </loader>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  </os>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:  <cpu>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <mode name='host-passthrough' supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='hostPassthroughMigratable'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>on</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>off</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </mode>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <mode name='maximum' supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <enum name='maximumMigratable'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>on</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <value>off</value>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </enum>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </mode>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <mode name='host-model' supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <vendor>AMD</vendor>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='x2apic'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='tsc-deadline'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='hypervisor'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='tsc_adjust'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='spec-ctrl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='stibp'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='ssbd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='cmp_legacy'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='overflow-recov'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='succor'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='ibrs'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='amd-ssbd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='virt-ssbd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='lbrv'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='tsc-scale'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='vmcb-clean'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='flushbyasid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='pause-filter'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='pfthreshold'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='svme-addr-chk'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <feature policy='disable' name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    </mode>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:    <mode name='custom' supported='yes'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-noTSX'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Broadwell-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-v3'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-v4'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cascadelake-Server-v5'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cooperlake'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cooperlake-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Cooperlake-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512-bf16'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512bw'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512cd'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512dq'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512f'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vl'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='avx512vnni'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='hle'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='ibrs-all'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='invpcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pcid'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='pku'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='rtm'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='taa-no'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='xsaves'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Denverton'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mpx'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Denverton-v1'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='mpx'/>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      </blockers>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:      <blockers model='Denverton-v2'>
Nov 23 16:01:56 np0005532763 nova_compute[231311]:        <feature name='erms'/>
Nov 23 16:02:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:17 np0005532763 rsyslogd[1011]: imjournal: 3202 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Nov 23 16:02:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:17.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:18.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:19.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:02:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:20.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:21.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:22.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:22 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:02:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:22 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:02:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:23 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:02:23 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1059527712' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:02:23 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:02:23 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1059527712' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:02:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:02:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:23.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:02:23 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:02:23 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/156012811' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:02:23 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:02:23 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/156012811' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:02:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:24.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:02:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:25.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:26.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:27.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:02:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:28.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:28 : epoch 69237658 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 16:02:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:29.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:29 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:02:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:30 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1600014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:30 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff148000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:30.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:31.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210231 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 16:02:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:31 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:32 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:32 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1600021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:32.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:32 np0005532763 podman[231983]: 2025-11-23 21:02:32.224195825 +0000 UTC m=+0.096333264 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 16:02:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:33.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:33 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:34 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:34 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:34.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:02:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:35.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:35 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1600021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:36 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:36 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:36.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:37.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:37 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:38 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:38 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1600021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:38.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:39 np0005532763 podman[232009]: 2025-11-23 21:02:39.257120494 +0000 UTC m=+0.136354077 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 16:02:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:39.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:02:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:39 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16c0091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:40 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:40 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:40.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:41.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:41 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1600021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:42 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16c0091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:42 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16c0091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:42.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:42 np0005532763 podman[232062]: 2025-11-23 21:02:42.355559374 +0000 UTC m=+0.084567104 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible)
Nov 23 16:02:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:43.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:43 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16c0091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:44 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff148002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:44 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:44.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:02:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:44 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Nov 23 16:02:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:45.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:45 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:46 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:46 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:46.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:47.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:47 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210248 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 16:02:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:48 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff148002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:48 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:48.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:48 np0005532763 nova_compute[231311]: 2025-11-23 21:02:48.277 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:02:48 np0005532763 nova_compute[231311]: 2025-11-23 21:02:48.293 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:02:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:49 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:02:49 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:02:49 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:02:49 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:02:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:49.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:02:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:49 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:50 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:50 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff148003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:50.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:51.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:51 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:52 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:52 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:02:52.215 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:02:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:02:52.216 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:02:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:02:52.216 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:02:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:52.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:53.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:53 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff148003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:54 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:54 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:54.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:54 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:02:54 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:02:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:02:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.385 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.386 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.386 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.387 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.412 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.412 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.413 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.413 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.414 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.414 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.414 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.415 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.415 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.460 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.461 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.461 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.461 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.462 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:02:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:55.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:55 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:55 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:02:55 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3108314033' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:02:55 np0005532763 nova_compute[231311]: 2025-11-23 21:02:55.950 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:02:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:55 : epoch 69237658 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:02:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:56 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff148003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:56 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:56 np0005532763 nova_compute[231311]: 2025-11-23 21:02:56.195 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:02:56 np0005532763 nova_compute[231311]: 2025-11-23 21:02:56.197 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5255MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:02:56 np0005532763 nova_compute[231311]: 2025-11-23 21:02:56.197 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:02:56 np0005532763 nova_compute[231311]: 2025-11-23 21:02:56.198 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:02:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:56.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:56 np0005532763 nova_compute[231311]: 2025-11-23 21:02:56.329 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:02:56 np0005532763 nova_compute[231311]: 2025-11-23 21:02:56.329 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:02:56 np0005532763 nova_compute[231311]: 2025-11-23 21:02:56.400 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:02:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:56 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:02:56 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1061901012' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:02:56 np0005532763 nova_compute[231311]: 2025-11-23 21:02:56.914 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:02:56 np0005532763 nova_compute[231311]: 2025-11-23 21:02:56.923 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:02:56 np0005532763 nova_compute[231311]: 2025-11-23 21:02:56.988 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:02:56 np0005532763 nova_compute[231311]: 2025-11-23 21:02:56.991 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:02:56 np0005532763 nova_compute[231311]: 2025-11-23 21:02:56.992 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.794s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:02:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:02:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:57.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:02:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:57 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff148003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:58 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:58 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:02:58.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:58 : epoch 69237658 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:02:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:58 : epoch 69237658 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:02:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:02:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:02:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:02:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:02:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:02:59.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:02:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:02:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:02:59 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:02:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:02:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:03:00 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff148003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:03:00 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:00.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:01.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:03:01 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:03:01 : epoch 69237658 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 16:03:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:03:02 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:03:02 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:03:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:02.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:03:02 np0005532763 podman[232281]: 2025-11-23 21:03:02.45819503 +0000 UTC m=+0.088570526 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 23 16:03:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:03.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[231885]: 23/11/2025 21:03:03 : epoch 69237658 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144003820 fd 38 proxy ignored for local
Nov 23 16:03:03 np0005532763 kernel: ganesha.nfsd[231979]: segfault at 50 ip 00007ff21970132e sp 00007ff1d27fb210 error 4 in libntirpc.so.5.8[7ff2196e6000+2c000] likely on CPU 7 (core 0, socket 7)
Nov 23 16:03:03 np0005532763 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Nov 23 16:03:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:03 np0005532763 systemd[1]: Started Process Core Dump (PID 232303/UID 0).
Nov 23 16:03:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:04.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:04 np0005532763 systemd-coredump[232304]: Process 231889 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 53:#012#0  0x00007ff21970132e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Nov 23 16:03:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:03:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:04 np0005532763 systemd[1]: systemd-coredump@12-232303-0.service: Deactivated successfully.
Nov 23 16:03:04 np0005532763 podman[232310]: 2025-11-23 21:03:04.925073819 +0000 UTC m=+0.026836194 container died 891b611a201a7e829b916f9467bdf6d4443f54ef165be28adc2724cb3826d774 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 23 16:03:04 np0005532763 systemd[1]: var-lib-containers-storage-overlay-22825facc1d9a44a83ac1b7cf3a0cc00a60d9abc9e0b68d6c31878646d6ab118-merged.mount: Deactivated successfully.
Nov 23 16:03:04 np0005532763 podman[232310]: 2025-11-23 21:03:04.963776134 +0000 UTC m=+0.065538509 container remove 891b611a201a7e829b916f9467bdf6d4443f54ef165be28adc2724cb3826d774 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 23 16:03:04 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Main process exited, code=exited, status=139/n/a
Nov 23 16:03:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:05 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Failed with result 'exit-code'.
Nov 23 16:03:05 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.645s CPU time.
Nov 23 16:03:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:05.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:06.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:07.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210308 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 16:03:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:08.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:08 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Nov 23 16:03:08 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/362349620' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Nov 23 16:03:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:09.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:03:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210309 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 16:03:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:10 np0005532763 podman[232359]: 2025-11-23 21:03:10.240717692 +0000 UTC m=+0.121981084 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:03:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:10.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:11.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:12.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:13 np0005532763 podman[232389]: 2025-11-23 21:03:13.221925585 +0000 UTC m=+0.098139615 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 23 16:03:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:13.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:14.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:03:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:15 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Scheduled restart job, restart counter is at 13.
Nov 23 16:03:15 np0005532763 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 16:03:15 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.645s CPU time.
Nov 23 16:03:15 np0005532763 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 16:03:15 np0005532763 podman[232456]: 2025-11-23 21:03:15.654478361 +0000 UTC m=+0.066852407 container create 261f1e1c2ba33cb0b64b03600924312ab55d6f6e98599ee1952871f374e85ba9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Nov 23 16:03:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:15.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:15 np0005532763 podman[232456]: 2025-11-23 21:03:15.625811137 +0000 UTC m=+0.038185193 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 16:03:15 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee7bbe9237f29dd575aab2a8d26fde60a3d4d12af522b8c612a8302538a35ff5/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Nov 23 16:03:15 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee7bbe9237f29dd575aab2a8d26fde60a3d4d12af522b8c612a8302538a35ff5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 16:03:15 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee7bbe9237f29dd575aab2a8d26fde60a3d4d12af522b8c612a8302538a35ff5/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 16:03:15 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee7bbe9237f29dd575aab2a8d26fde60a3d4d12af522b8c612a8302538a35ff5/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.dqbktw-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 16:03:15 np0005532763 podman[232456]: 2025-11-23 21:03:15.756925106 +0000 UTC m=+0.169299182 container init 261f1e1c2ba33cb0b64b03600924312ab55d6f6e98599ee1952871f374e85ba9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Nov 23 16:03:15 np0005532763 podman[232456]: 2025-11-23 21:03:15.773526002 +0000 UTC m=+0.185900048 container start 261f1e1c2ba33cb0b64b03600924312ab55d6f6e98599ee1952871f374e85ba9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Nov 23 16:03:15 np0005532763 bash[232456]: 261f1e1c2ba33cb0b64b03600924312ab55d6f6e98599ee1952871f374e85ba9
Nov 23 16:03:15 np0005532763 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 16:03:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:15 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Nov 23 16:03:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:15 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Nov 23 16:03:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:15 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Nov 23 16:03:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:15 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Nov 23 16:03:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:15 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Nov 23 16:03:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:15 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Nov 23 16:03:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:15 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Nov 23 16:03:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:15 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:03:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:16.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:17.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:18.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:19.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:03:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:20.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:21.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:21 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:03:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:21 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:03:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:03:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:22.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:03:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:23.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:24.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:03:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:25.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:26.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:27.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Nov 23 16:03:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:27 : epoch 69237693 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 16:03:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:28 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b1c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:28 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b10001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:28.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:28 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Nov 23 16:03:28 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3674086929' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Nov 23 16:03:28 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Nov 23 16:03:28 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/395877405' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Nov 23 16:03:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:29.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:03:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:29 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:30 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:30 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:30.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:31.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210331 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 16:03:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:31 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b10001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:32 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:32 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:32.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:33 np0005532763 podman[232571]: 2025-11-23 21:03:33.222903392 +0000 UTC m=+0.091507259 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 23 16:03:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:33.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:33 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:34 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b10001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:34 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000056s ======
Nov 23 16:03:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:34.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Nov 23 16:03:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:03:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:35.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:35 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:36 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:36 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b10001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:36.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:37.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:37 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:38 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:38 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:38.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:39.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:03:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:39 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b10001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:40 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:40 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:40.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:41 np0005532763 podman[232598]: 2025-11-23 21:03:41.259057772 +0000 UTC m=+0.131322946 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 16:03:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:41.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:41 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:42 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b10001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:42 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:42.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:43.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:43 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:44 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:44 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b10001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:44 np0005532763 podman[232653]: 2025-11-23 21:03:44.219369517 +0000 UTC m=+0.097299691 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 23 16:03:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:44.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:03:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:45.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:45 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:46 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af0002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:46 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:46.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:47.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:47 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b10001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:48 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:48 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:48.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:03:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:49.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:03:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:03:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:49 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:50 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b10001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:50 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:50.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:51.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:51 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:52 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:52 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b10001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:03:52.216 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:03:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:03:52.216 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:03:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:03:52.217 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:03:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:52.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:53.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:53 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:54 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:54 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:54.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:03:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:55 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:03:55 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:03:55 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:03:55 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:03:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:03:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:55.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:03:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:55 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b10003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:56 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210356 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 16:03:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:56 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:56.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:56 np0005532763 nova_compute[231311]: 2025-11-23 21:03:56.985 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.002 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.002 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.003 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.017 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.018 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.019 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.019 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.020 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.020 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.038 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.038 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.039 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.039 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.040 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:03:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:57 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:03:57 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4236263840' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.535 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.727 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.728 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5250MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.729 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.729 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:03:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:57.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.790 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.790 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:03:57 np0005532763 nova_compute[231311]: 2025-11-23 21:03:57.815 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:03:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:57 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:58 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b10003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:58 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:58 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:03:58 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4197645649' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:03:58 np0005532763 nova_compute[231311]: 2025-11-23 21:03:58.272 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:03:58 np0005532763 nova_compute[231311]: 2025-11-23 21:03:58.279 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:03:58 np0005532763 nova_compute[231311]: 2025-11-23 21:03:58.308 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:03:58 np0005532763 nova_compute[231311]: 2025-11-23 21:03:58.309 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:03:58 np0005532763 nova_compute[231311]: 2025-11-23 21:03:58.309 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.580s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:03:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:03:58.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:58 np0005532763 nova_compute[231311]: 2025-11-23 21:03:58.672 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:03:58 np0005532763 nova_compute[231311]: 2025-11-23 21:03:58.672 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:03:58 np0005532763 nova_compute[231311]: 2025-11-23 21:03:58.673 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:03:58 np0005532763 nova_compute[231311]: 2025-11-23 21:03:58.673 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:03:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:03:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:03:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:03:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:03:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:03:59.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:03:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:03:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:03:59 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:03:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:03:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:00 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:00 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aec000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:00.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:01.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:01 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:04:01 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:04:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:01 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:02 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b1c001340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:02 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:02.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:03.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:03 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aec0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210404 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 16:04:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:04 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:04 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b1c001340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:04 np0005532763 podman[232871]: 2025-11-23 21:04:04.216778692 +0000 UTC m=+0.089721666 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 23 16:04:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:04.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:04:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:05 : epoch 69237693 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:04:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:05.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:05 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:06 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aec0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:06 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:06.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:07.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:07 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b1c001340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:08 : epoch 69237693 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:04:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:08 : epoch 69237693 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:04:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:08 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:08 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aec0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:08.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:09 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:04:09.066 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3a:26:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:d5:4d:db:d5:2b'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:04:09 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:04:09.068 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 23 16:04:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:09 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:04:09.070 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10e3bf57-dd2d-4b94-851f-925bcd297dde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:04:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:09.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:04:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:09 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aec0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:10 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b1c001340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:10 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:10.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:10 : epoch 69237693 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:04:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:11.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:11 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:04:11.842899) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931851842938, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2391, "num_deletes": 251, "total_data_size": 6376138, "memory_usage": 6460384, "flush_reason": "Manual Compaction"}
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Nov 23 16:04:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931851864578, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 4132392, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20890, "largest_seqno": 23276, "table_properties": {"data_size": 4122798, "index_size": 6024, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19855, "raw_average_key_size": 20, "raw_value_size": 4103685, "raw_average_value_size": 4191, "num_data_blocks": 264, "num_entries": 979, "num_filter_entries": 979, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763931631, "oldest_key_time": 1763931631, "file_creation_time": 1763931851, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 21744 microseconds, and 14188 cpu microseconds.
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:04:11.864639) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 4132392 bytes OK
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:04:11.864664) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:04:11.867296) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:04:11.867323) EVENT_LOG_v1 {"time_micros": 1763931851867314, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:04:11.867351) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 6365714, prev total WAL file size 6365714, number of live WAL files 2.
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:04:11.869727) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(4035KB)], [39(12MB)]
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931851869831, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 17511024, "oldest_snapshot_seqno": -1}
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 5439 keys, 15313456 bytes, temperature: kUnknown
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931851961974, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 15313456, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15274761, "index_size": 23993, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13637, "raw_key_size": 137247, "raw_average_key_size": 25, "raw_value_size": 15174010, "raw_average_value_size": 2789, "num_data_blocks": 991, "num_entries": 5439, "num_filter_entries": 5439, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763931851, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:04:11.962345) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 15313456 bytes
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:04:11.963972) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 189.9 rd, 166.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 12.8 +0.0 blob) out(14.6 +0.0 blob), read-write-amplify(7.9) write-amplify(3.7) OK, records in: 5959, records dropped: 520 output_compression: NoCompression
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:04:11.964001) EVENT_LOG_v1 {"time_micros": 1763931851963988, "job": 22, "event": "compaction_finished", "compaction_time_micros": 92224, "compaction_time_cpu_micros": 59583, "output_level": 6, "num_output_files": 1, "total_output_size": 15313456, "num_input_records": 5959, "num_output_records": 5439, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931851965444, "job": 22, "event": "table_file_deletion", "file_number": 41}
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931851969850, "job": 22, "event": "table_file_deletion", "file_number": 39}
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:04:11.869556) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:04:11.969925) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:04:11.969932) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:04:11.969935) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:04:11.969938) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:04:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:04:11.969941) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:04:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:12 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b1c001340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:12 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aec002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:12 np0005532763 podman[232899]: 2025-11-23 21:04:12.273736622 +0000 UTC m=+0.150380587 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 16:04:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:12.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:13.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:13 : epoch 69237693 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 16:04:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:13 : epoch 69237693 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:04:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:13 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aec002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:14 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:14 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b1c009630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:14.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:04:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:15 np0005532763 podman[232929]: 2025-11-23 21:04:15.219918678 +0000 UTC m=+0.093424470 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Nov 23 16:04:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:15.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:15 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aec002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:16 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:16 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:16.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:16 : epoch 69237693 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:04:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:16 : epoch 69237693 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:04:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:16 : epoch 69237693 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:04:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:16 : epoch 69237693 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:04:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:04:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:17.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:04:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:17 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1b1c009630 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:18 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1aec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:18 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1afc004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:18.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:19.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:04:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[232472]: 23/11/2025 21:04:19 : epoch 69237693 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1af8003c10 fd 38 proxy ignored for local
Nov 23 16:04:19 np0005532763 kernel: ganesha.nfsd[232565]: segfault at 50 ip 00007f1bc4dfb32e sp 00007f1b797f9210 error 4 in libntirpc.so.5.8[7f1bc4de0000+2c000] likely on CPU 5 (core 0, socket 5)
Nov 23 16:04:19 np0005532763 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Nov 23 16:04:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:19 np0005532763 systemd[1]: Started Process Core Dump (PID 232955/UID 0).
Nov 23 16:04:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210420 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 16:04:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:20.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:20 np0005532763 systemd-coredump[232956]: Process 232476 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 54:#012#0  0x00007f1bc4dfb32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Nov 23 16:04:21 np0005532763 systemd[1]: systemd-coredump@13-232955-0.service: Deactivated successfully.
Nov 23 16:04:21 np0005532763 systemd[1]: systemd-coredump@13-232955-0.service: Consumed 1.068s CPU time.
Nov 23 16:04:21 np0005532763 podman[232962]: 2025-11-23 21:04:21.063307611 +0000 UTC m=+0.026062082 container died 261f1e1c2ba33cb0b64b03600924312ab55d6f6e98599ee1952871f374e85ba9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:04:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:21 np0005532763 systemd[1]: var-lib-containers-storage-overlay-ee7bbe9237f29dd575aab2a8d26fde60a3d4d12af522b8c612a8302538a35ff5-merged.mount: Deactivated successfully.
Nov 23 16:04:21 np0005532763 podman[232962]: 2025-11-23 21:04:21.118874779 +0000 UTC m=+0.081629180 container remove 261f1e1c2ba33cb0b64b03600924312ab55d6f6e98599ee1952871f374e85ba9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Nov 23 16:04:21 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Main process exited, code=exited, status=139/n/a
Nov 23 16:04:21 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Failed with result 'exit-code'.
Nov 23 16:04:21 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.739s CPU time.
Nov 23 16:04:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:21.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:22.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:23.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:24.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:04:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:25.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210425 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 16:04:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210426 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 1ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 16:04:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:26.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:27.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:28.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:29.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:04:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:30.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:31 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Scheduled restart job, restart counter is at 14.
Nov 23 16:04:31 np0005532763 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 16:04:31 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 1.739s CPU time.
Nov 23 16:04:31 np0005532763 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 16:04:31 np0005532763 podman[233089]: 2025-11-23 21:04:31.635478768 +0000 UTC m=+0.043104730 container create 7921b599458ffbb72715ff70d93a9d72e81e19b2899dc7ee74263bca36278a23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1)
Nov 23 16:04:31 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07061cdc3c556e8090fbe0c3cfc396558d200bcbc400be2482075806b3283578/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Nov 23 16:04:31 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07061cdc3c556e8090fbe0c3cfc396558d200bcbc400be2482075806b3283578/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 16:04:31 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07061cdc3c556e8090fbe0c3cfc396558d200bcbc400be2482075806b3283578/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 16:04:31 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07061cdc3c556e8090fbe0c3cfc396558d200bcbc400be2482075806b3283578/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.dqbktw-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 16:04:31 np0005532763 podman[233089]: 2025-11-23 21:04:31.615083806 +0000 UTC m=+0.022709798 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 16:04:31 np0005532763 podman[233089]: 2025-11-23 21:04:31.712888048 +0000 UTC m=+0.120514090 container init 7921b599458ffbb72715ff70d93a9d72e81e19b2899dc7ee74263bca36278a23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 23 16:04:31 np0005532763 podman[233089]: 2025-11-23 21:04:31.72365359 +0000 UTC m=+0.131279582 container start 7921b599458ffbb72715ff70d93a9d72e81e19b2899dc7ee74263bca36278a23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:04:31 np0005532763 bash[233089]: 7921b599458ffbb72715ff70d93a9d72e81e19b2899dc7ee74263bca36278a23
Nov 23 16:04:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:31 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Nov 23 16:04:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:31 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Nov 23 16:04:31 np0005532763 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 16:04:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:31.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:31 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Nov 23 16:04:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:31 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Nov 23 16:04:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:31 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Nov 23 16:04:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:31 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Nov 23 16:04:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:31 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Nov 23 16:04:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:31 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:04:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:32.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:33.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:34.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:04:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:35 np0005532763 podman[233150]: 2025-11-23 21:04:35.193108407 +0000 UTC m=+0.076733872 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 16:04:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:35.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:36.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:37.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:37 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:04:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:37 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:04:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:38.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:39.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:04:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:40.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:41.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:42.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:43 np0005532763 podman[233202]: 2025-11-23 21:04:43.051314943 +0000 UTC m=+0.122323390 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118)
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:43.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Nov 23 16:04:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Nov 23 16:04:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:44 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc588000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:44 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:44.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:04:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:45.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:45 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:46 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc574001230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:46 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:46 np0005532763 podman[233248]: 2025-11-23 21:04:46.213536517 +0000 UTC m=+0.094009506 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 16:04:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:46.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:47.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210447 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 16:04:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:47 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:48 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5640016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:48 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5640016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:48.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:49.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:04:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:49 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:50 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:50 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5640016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:50.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:51.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:51 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc574001d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:52 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580002bc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:52 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580002bc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:04:52.217 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:04:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:04:52.217 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:04:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:04:52.218 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:04:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:52.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:53.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:53 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5640016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:54 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc574002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:54 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580002bc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:54.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:04:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:55.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:55 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580002bc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:56 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5640016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:56 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc574002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:56 np0005532763 nova_compute[231311]: 2025-11-23 21:04:56.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:04:56 np0005532763 nova_compute[231311]: 2025-11-23 21:04:56.402 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:04:56 np0005532763 nova_compute[231311]: 2025-11-23 21:04:56.402 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:04:56 np0005532763 nova_compute[231311]: 2025-11-23 21:04:56.403 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:04:56 np0005532763 nova_compute[231311]: 2025-11-23 21:04:56.403 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:04:56 np0005532763 nova_compute[231311]: 2025-11-23 21:04:56.404 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:04:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:56.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:56 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:04:56 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2124596384' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:04:56 np0005532763 nova_compute[231311]: 2025-11-23 21:04:56.879 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:04:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:57 np0005532763 nova_compute[231311]: 2025-11-23 21:04:57.114 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:04:57 np0005532763 nova_compute[231311]: 2025-11-23 21:04:57.116 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5205MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:04:57 np0005532763 nova_compute[231311]: 2025-11-23 21:04:57.117 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:04:57 np0005532763 nova_compute[231311]: 2025-11-23 21:04:57.118 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:04:57 np0005532763 nova_compute[231311]: 2025-11-23 21:04:57.179 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:04:57 np0005532763 nova_compute[231311]: 2025-11-23 21:04:57.180 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:04:57 np0005532763 nova_compute[231311]: 2025-11-23 21:04:57.201 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:04:57 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:04:57 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3486645363' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:04:57 np0005532763 nova_compute[231311]: 2025-11-23 21:04:57.686 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:04:57 np0005532763 nova_compute[231311]: 2025-11-23 21:04:57.695 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:04:57 np0005532763 nova_compute[231311]: 2025-11-23 21:04:57.711 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:04:57 np0005532763 nova_compute[231311]: 2025-11-23 21:04:57.713 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:04:57 np0005532763 nova_compute[231311]: 2025-11-23 21:04:57.714 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:04:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:57.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:57 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580002bc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:58 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580002bc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:58 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5640032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:04:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:04:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:04:58.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:04:58 np0005532763 nova_compute[231311]: 2025-11-23 21:04:58.714 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:04:58 np0005532763 nova_compute[231311]: 2025-11-23 21:04:58.714 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:04:58 np0005532763 nova_compute[231311]: 2025-11-23 21:04:58.715 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:04:58 np0005532763 nova_compute[231311]: 2025-11-23 21:04:58.715 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:04:58 np0005532763 nova_compute[231311]: 2025-11-23 21:04:58.728 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:04:58 np0005532763 nova_compute[231311]: 2025-11-23 21:04:58.728 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:04:58 np0005532763 nova_compute[231311]: 2025-11-23 21:04:58.729 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:04:58 np0005532763 nova_compute[231311]: 2025-11-23 21:04:58.729 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:04:58 np0005532763 nova_compute[231311]: 2025-11-23 21:04:58.729 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:04:58 np0005532763 nova_compute[231311]: 2025-11-23 21:04:58.729 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:04:58 np0005532763 nova_compute[231311]: 2025-11-23 21:04:58.730 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:04:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:04:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:59 np0005532763 nova_compute[231311]: 2025-11-23 21:04:59.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:04:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:04:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:04:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:04:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:04:59.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:04:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:04:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:04:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:04:59 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc574002700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:00 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580002bc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:00 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:00.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:01.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:01 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5640032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:01 np0005532763 podman[233450]: 2025-11-23 21:05:01.984989867 +0000 UTC m=+0.152864937 container exec 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Nov 23 16:05:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:02 np0005532763 podman[233450]: 2025-11-23 21:05:02.164791848 +0000 UTC m=+0.332666908 container exec_died 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 23 16:05:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:02 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740039c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:02 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580002bc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:02 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Nov 23 16:05:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:02.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:02 np0005532763 podman[233570]: 2025-11-23 21:05:02.815210403 +0000 UTC m=+0.075284012 container exec bfa89024a4f3a8c3745fbdf8141ab9c1af6ff603988de647c9e7f7e15dff8638 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 16:05:02 np0005532763 podman[233570]: 2025-11-23 21:05:02.830859032 +0000 UTC m=+0.090932701 container exec_died bfa89024a4f3a8c3745fbdf8141ab9c1af6ff603988de647c9e7f7e15dff8638 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 16:05:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:03 np0005532763 podman[233688]: 2025-11-23 21:05:03.347576238 +0000 UTC m=+0.098651617 container exec 7921b599458ffbb72715ff70d93a9d72e81e19b2899dc7ee74263bca36278a23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Nov 23 16:05:03 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:05:03 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:05:03 np0005532763 podman[233688]: 2025-11-23 21:05:03.368734031 +0000 UTC m=+0.119809330 container exec_died 7921b599458ffbb72715ff70d93a9d72e81e19b2899dc7ee74263bca36278a23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Nov 23 16:05:03 np0005532763 podman[233756]: 2025-11-23 21:05:03.751592055 +0000 UTC m=+0.083630426 container exec 187afc4c1e67339be091cc4caff41c0e2aaba4673fc086f757180d516596ee6c (image=quay.io/ceph/haproxy:2.3, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem)
Nov 23 16:05:03 np0005532763 podman[233756]: 2025-11-23 21:05:03.788701845 +0000 UTC m=+0.120740156 container exec_died 187afc4c1e67339be091cc4caff41c0e2aaba4673fc086f757180d516596ee6c (image=quay.io/ceph/haproxy:2.3, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem)
Nov 23 16:05:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:03.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:03 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:04 np0005532763 podman[233821]: 2025-11-23 21:05:04.11381104 +0000 UTC m=+0.085603061 container exec f83166e24f35928d8e85c6352ec69e598c685dd22eb2d34bc93aec691f658844 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt, release=1793, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Nov 23 16:05:04 np0005532763 podman[233821]: 2025-11-23 21:05:04.179666176 +0000 UTC m=+0.151458147 container exec_died f83166e24f35928d8e85c6352ec69e598c685dd22eb2d34bc93aec691f658844 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, vcs-type=git, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, io.buildah.version=1.28.2, version=2.2.4, description=keepalived for Ceph, name=keepalived, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Keepalived on RHEL 9)
Nov 23 16:05:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:04 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:04 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740039c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:04 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 23 16:05:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:04.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:05:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:05 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:05:05 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:05:05 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 23 16:05:05 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:05:05 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:05:05 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:05:05 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:05:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:05.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:05 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580002bc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:06 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:06 np0005532763 podman[233973]: 2025-11-23 21:05:06.220577544 +0000 UTC m=+0.094163611 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 16:05:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:06 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:06.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:07.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:07 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:08 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580002bc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:08 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580002bc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:08.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:05:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:09.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:09 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580002bc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:10 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:10 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:10.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:05:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:05:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:11.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:11 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580002bc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:12 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580002bc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:12 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:12.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:13 np0005532763 podman[234024]: 2025-11-23 21:05:13.257553677 +0000 UTC m=+0.128206275 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 23 16:05:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:13.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:13 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:14 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:14 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:14.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:05:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:15.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:15 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:16 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:16 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:16.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:17 np0005532763 podman[234055]: 2025-11-23 21:05:17.220699025 +0000 UTC m=+0.098471492 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible)
Nov 23 16:05:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:17.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:17 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210518 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 16:05:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:18 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:18 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:18.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:05:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:19.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:19 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5580016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:20 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:20 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:20.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:21.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:21 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:22 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5580016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:22 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:22.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:23.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:23 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:24 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:24 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5580016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:24.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:05:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:25.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:25 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:26 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:26 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:26.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:27 : epoch 692376df : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:05:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:05:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:27.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:05:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:27 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:28 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:28 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:28.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:05:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:29.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:29 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:30 : epoch 692376df : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:05:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:30 : epoch 692376df : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:05:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:30 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:30 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:30.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:31.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:31 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:32 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:32 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:32.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:33 : epoch 692376df : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 16:05:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:33.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:33 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:34 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:34 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:34.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:05:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:35.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:35 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:36 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:36 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000056s ======
Nov 23 16:05:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:36.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Nov 23 16:05:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:37 np0005532763 podman[234123]: 2025-11-23 21:05:37.208076299 +0000 UTC m=+0.081571068 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 16:05:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:37.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:37 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210538 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 16:05:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:38 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210538 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 143ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 16:05:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:38 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:38.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:05:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:39.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:39 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=404 latency=0.002000057s ======
Nov 23 16:05:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:39.959 +0000] "GET /healthcheck HTTP/1.1" 404 242 - "python-urllib3/1.26.5" - latency=0.002000057s
Nov 23 16:05:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:40 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:40 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc574002250 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:40.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:41.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:41 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:42 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:42 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000056s ======
Nov 23 16:05:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:42.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Nov 23 16:05:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:43.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc574002250 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:44 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:44 np0005532763 podman[234176]: 2025-11-23 21:05:44.250800657 +0000 UTC m=+0.126149838 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:05:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:44 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:44.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:05:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e143 e143: 3 total, 3 up, 3 in
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:05:45.373458) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931945373506, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 1191, "num_deletes": 251, "total_data_size": 2797890, "memory_usage": 2842920, "flush_reason": "Manual Compaction"}
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931945388949, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 1182461, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23281, "largest_seqno": 24467, "table_properties": {"data_size": 1178229, "index_size": 1820, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10920, "raw_average_key_size": 20, "raw_value_size": 1169096, "raw_average_value_size": 2189, "num_data_blocks": 78, "num_entries": 534, "num_filter_entries": 534, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763931853, "oldest_key_time": 1763931853, "file_creation_time": 1763931945, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 15562 microseconds, and 6040 cpu microseconds.
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:05:45.389021) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 1182461 bytes OK
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:05:45.389042) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:05:45.392177) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:05:45.392197) EVENT_LOG_v1 {"time_micros": 1763931945392191, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:05:45.392219) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 2792218, prev total WAL file size 2792218, number of live WAL files 2.
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:05:45.393539) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353033' seq:72057594037927935, type:22 .. '6D67727374617400373535' seq:0, type:0; will stop at (end)
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(1154KB)], [42(14MB)]
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931945393591, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 16495917, "oldest_snapshot_seqno": -1}
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5490 keys, 13036751 bytes, temperature: kUnknown
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931945581213, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 13036751, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13000998, "index_size": 20923, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13765, "raw_key_size": 138606, "raw_average_key_size": 25, "raw_value_size": 12902699, "raw_average_value_size": 2350, "num_data_blocks": 856, "num_entries": 5490, "num_filter_entries": 5490, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763931945, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:05:45.581606) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 13036751 bytes
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:05:45.583400) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 87.8 rd, 69.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 14.6 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(25.0) write-amplify(11.0) OK, records in: 5973, records dropped: 483 output_compression: NoCompression
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:05:45.583437) EVENT_LOG_v1 {"time_micros": 1763931945583420, "job": 24, "event": "compaction_finished", "compaction_time_micros": 187785, "compaction_time_cpu_micros": 47871, "output_level": 6, "num_output_files": 1, "total_output_size": 13036751, "num_input_records": 5973, "num_output_records": 5490, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931945583973, "job": 24, "event": "table_file_deletion", "file_number": 44}
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931945589242, "job": 24, "event": "table_file_deletion", "file_number": 42}
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:05:45.393389) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:05:45.589319) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:05:45.589326) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:05:45.589329) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:05:45.589332) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:05:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:05:45.589335) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:05:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:45.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:45 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:46 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e144 e144: 3 total, 3 up, 3 in
Nov 23 16:05:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:46 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc574002250 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:46 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:46.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:47 : epoch 692376df : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:05:47 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e145 e145: 3 total, 3 up, 3 in
Nov 23 16:05:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:47.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:47 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:48 np0005532763 podman[234208]: 2025-11-23 21:05:48.217477344 +0000 UTC m=+0.092423332 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Nov 23 16:05:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:48 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:48 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580001fd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:48.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e146 e146: 3 total, 3 up, 3 in
Nov 23 16:05:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:05:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:49.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:49 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:50 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:50 : epoch 692376df : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:05:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:50 : epoch 692376df : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:05:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:50 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:50.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:51.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:51 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580001fd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:05:52.218 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:05:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:05:52.219 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:05:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:05:52.219 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:05:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:52 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580001fd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:52 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:52.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:53 : epoch 692376df : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 16:05:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:05:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:53.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:05:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:53 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:54 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580001fd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:54 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:54.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:05:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:55 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e147 e147: 3 total, 3 up, 3 in
Nov 23 16:05:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:55.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:55 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:56 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:56 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580001fd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:56.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:57 np0005532763 nova_compute[231311]: 2025-11-23 21:05:57.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:05:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:57.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:57 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:58 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:58 np0005532763 nova_compute[231311]: 2025-11-23 21:05:58.378 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:05:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:58 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:05:58 np0005532763 nova_compute[231311]: 2025-11-23 21:05:58.393 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:05:58 np0005532763 nova_compute[231311]: 2025-11-23 21:05:58.394 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:05:58 np0005532763 nova_compute[231311]: 2025-11-23 21:05:58.418 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:05:58 np0005532763 nova_compute[231311]: 2025-11-23 21:05:58.418 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:05:58 np0005532763 nova_compute[231311]: 2025-11-23 21:05:58.419 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:05:58 np0005532763 nova_compute[231311]: 2025-11-23 21:05:58.419 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:05:58 np0005532763 nova_compute[231311]: 2025-11-23 21:05:58.419 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:05:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:05:58.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:58 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:05:58 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2345273299' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:05:58 np0005532763 nova_compute[231311]: 2025-11-23 21:05:58.919 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:05:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:05:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:59 np0005532763 nova_compute[231311]: 2025-11-23 21:05:59.176 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:05:59 np0005532763 nova_compute[231311]: 2025-11-23 21:05:59.178 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5247MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:05:59 np0005532763 nova_compute[231311]: 2025-11-23 21:05:59.179 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:05:59 np0005532763 nova_compute[231311]: 2025-11-23 21:05:59.179 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:05:59 np0005532763 nova_compute[231311]: 2025-11-23 21:05:59.254 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:05:59 np0005532763 nova_compute[231311]: 2025-11-23 21:05:59.254 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:05:59 np0005532763 nova_compute[231311]: 2025-11-23 21:05:59.276 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:05:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:05:59 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2552729546' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:05:59 np0005532763 nova_compute[231311]: 2025-11-23 21:05:59.757 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:05:59 np0005532763 nova_compute[231311]: 2025-11-23 21:05:59.765 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:05:59 np0005532763 nova_compute[231311]: 2025-11-23 21:05:59.778 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:05:59 np0005532763 nova_compute[231311]: 2025-11-23 21:05:59.780 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:05:59 np0005532763 nova_compute[231311]: 2025-11-23 21:05:59.781 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:05:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:05:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:05:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:05:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:05:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:05:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:05:59.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:05:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:05:59 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580001fd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210600 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 16:06:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:00 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:00 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:00.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:00 np0005532763 nova_compute[231311]: 2025-11-23 21:06:00.770 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:06:00 np0005532763 nova_compute[231311]: 2025-11-23 21:06:00.771 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:06:00 np0005532763 nova_compute[231311]: 2025-11-23 21:06:00.771 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:06:00 np0005532763 nova_compute[231311]: 2025-11-23 21:06:00.771 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:06:00 np0005532763 nova_compute[231311]: 2025-11-23 21:06:00.788 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:06:00 np0005532763 nova_compute[231311]: 2025-11-23 21:06:00.788 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:06:00 np0005532763 nova_compute[231311]: 2025-11-23 21:06:00.788 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:06:00 np0005532763 nova_compute[231311]: 2025-11-23 21:06:00.789 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:06:00 np0005532763 nova_compute[231311]: 2025-11-23 21:06:00.790 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:06:00 np0005532763 nova_compute[231311]: 2025-11-23 21:06:00.790 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:06:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:01.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:01 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:02 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580003e60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:02 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:02.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:03.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:03 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:04 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:04 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580003e60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:04.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:06:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:05.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:05 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:06 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:06 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:06.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:06:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2701893924' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:06:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:06:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2701893924' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:06:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:07.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:07 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:08 np0005532763 podman[234317]: 2025-11-23 21:06:08.213621101 +0000 UTC m=+0.084010617 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 23 16:06:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:08 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:08 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:08.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:09 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:06:09.033 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3a:26:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:d5:4d:db:d5:2b'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:06:09 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:06:09.034 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 23 16:06:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:06:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:09.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:09 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210610 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 16:06:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:10 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:10 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:10.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:06:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:11.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:06:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:11 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:12 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:12 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:12.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:13.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:13 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:14 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:14 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:06:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:14.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:06:14 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:06:14 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:06:14 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:06:14 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:06:14 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:06:14 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:06:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:06:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:15 np0005532763 podman[234428]: 2025-11-23 21:06:15.255360569 +0000 UTC m=+0.135561494 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 23 16:06:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:15.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:15 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:16 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:06:16.036 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10e3bf57-dd2d-4b94-851f-925bcd297dde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:06:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:16 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:16 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:16.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:17.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:17 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:18 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:18 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:18.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:19 np0005532763 podman[234459]: 2025-11-23 21:06:19.223688539 +0000 UTC m=+0.094805940 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 16:06:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:19 : epoch 692376df : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:06:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:06:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:19.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:19 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:19 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:06:19 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:06:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:20 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:20 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:20.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:21.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:21 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:22 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:22 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:22 : epoch 692376df : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:06:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:22 : epoch 692376df : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:06:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:22.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:23.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:23 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:24 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:24 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:24.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:06:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:25 : epoch 692376df : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Nov 23 16:06:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:25.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:25 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:26 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:26 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:26.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000056s ======
Nov 23 16:06:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:27.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Nov 23 16:06:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:27 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:28 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:28 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:28.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:06:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:29.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:29 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:30 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:30 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:30.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:31.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:31 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210632 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Nov 23 16:06:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:32 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:32 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:06:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:32.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:06:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:33.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:33 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003d90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:34 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:34 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:34.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:06:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:06:35.493365) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931995493407, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 817, "num_deletes": 255, "total_data_size": 1684682, "memory_usage": 1711136, "flush_reason": "Manual Compaction"}
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931995503878, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 1093658, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24472, "largest_seqno": 25284, "table_properties": {"data_size": 1089759, "index_size": 1679, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8609, "raw_average_key_size": 18, "raw_value_size": 1081741, "raw_average_value_size": 2351, "num_data_blocks": 74, "num_entries": 460, "num_filter_entries": 460, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763931946, "oldest_key_time": 1763931946, "file_creation_time": 1763931995, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 10584 microseconds, and 6260 cpu microseconds.
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:06:35.503944) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 1093658 bytes OK
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:06:35.503973) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:06:35.506350) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:06:35.506381) EVENT_LOG_v1 {"time_micros": 1763931995506371, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:06:35.506410) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 1680412, prev total WAL file size 1680412, number of live WAL files 2.
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:06:35.507378) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323531' seq:72057594037927935, type:22 .. '6C6F676D00353032' seq:0, type:0; will stop at (end)
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(1068KB)], [45(12MB)]
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931995507424, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 14130409, "oldest_snapshot_seqno": -1}
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 5423 keys, 13967931 bytes, temperature: kUnknown
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931995616045, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 13967931, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13931316, "index_size": 21977, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13573, "raw_key_size": 138446, "raw_average_key_size": 25, "raw_value_size": 13832875, "raw_average_value_size": 2550, "num_data_blocks": 897, "num_entries": 5423, "num_filter_entries": 5423, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763931995, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:06:35.616483) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 13967931 bytes
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:06:35.618190) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 129.9 rd, 128.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 12.4 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(25.7) write-amplify(12.8) OK, records in: 5950, records dropped: 527 output_compression: NoCompression
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:06:35.618223) EVENT_LOG_v1 {"time_micros": 1763931995618208, "job": 26, "event": "compaction_finished", "compaction_time_micros": 108779, "compaction_time_cpu_micros": 50466, "output_level": 6, "num_output_files": 1, "total_output_size": 13967931, "num_input_records": 5950, "num_output_records": 5423, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931995619054, "job": 26, "event": "table_file_deletion", "file_number": 47}
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763931995624244, "job": 26, "event": "table_file_deletion", "file_number": 45}
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:06:35.507297) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:06:35.624456) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:06:35.624464) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:06:35.624467) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:06:35.624470) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:06:35 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:06:35.624473) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:06:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:35.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:35 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:36 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:36 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:36.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:36 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e148 e148: 3 total, 3 up, 3 in
Nov 23 16:06:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:37 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:37.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:38 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:38 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:38.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:39 np0005532763 podman[234547]: 2025-11-23 21:06:39.205352747 +0000 UTC m=+0.082136027 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 23 16:06:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e149 e149: 3 total, 3 up, 3 in
Nov 23 16:06:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:06:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:39 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:39.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:40 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:40 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:40.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:41 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003e50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:41.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:42 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:42 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c003dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:42.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:43.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:44 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:44 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc574001760 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:44.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:06:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:45 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:45.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:46 np0005532763 podman[234602]: 2025-11-23 21:06:46.2546539 +0000 UTC m=+0.133345173 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 16:06:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:46 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c001080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:46 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:46.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:47 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:47.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:48 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:48 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c001080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:48.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:06:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:49 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc574002ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:49.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:50 np0005532763 podman[234633]: 2025-11-23 21:06:50.222750023 +0000 UTC m=+0.095894630 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 23 16:06:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:50 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:50 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:50 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 e150: 3 total, 3 up, 3 in
Nov 23 16:06:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:50.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:51 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:51.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:06:52.220 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:06:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:06:52.220 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:06:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:06:52.220 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:06:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:52 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc574002ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:52 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:52.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:53 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:53.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:54 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c002fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:54 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc574002ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:54.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:06:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:55 np0005532763 nova_compute[231311]: 2025-11-23 21:06:55.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:06:55 np0005532763 nova_compute[231311]: 2025-11-23 21:06:55.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 23 16:06:55 np0005532763 nova_compute[231311]: 2025-11-23 21:06:55.412 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 23 16:06:55 np0005532763 nova_compute[231311]: 2025-11-23 21:06:55.413 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:06:55 np0005532763 nova_compute[231311]: 2025-11-23 21:06:55.413 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 23 16:06:55 np0005532763 nova_compute[231311]: 2025-11-23 21:06:55.432 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:06:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:55 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:55.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:56 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:56 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c0038e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:56.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:06:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:57 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc574002ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:57.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:58 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:58 np0005532763 nova_compute[231311]: 2025-11-23 21:06:58.453 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:06:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:58 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:06:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:06:58.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:06:58 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Nov 23 16:06:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:06:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:59 np0005532763 nova_compute[231311]: 2025-11-23 21:06:59.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:06:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:06:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:06:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:06:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:06:59 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c0038e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:06:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:06:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:06:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:06:59.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:00 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc574002ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:00 np0005532763 nova_compute[231311]: 2025-11-23 21:07:00.380 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:07:00 np0005532763 nova_compute[231311]: 2025-11-23 21:07:00.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:07:00 np0005532763 nova_compute[231311]: 2025-11-23 21:07:00.382 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:07:00 np0005532763 nova_compute[231311]: 2025-11-23 21:07:00.382 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:07:00 np0005532763 nova_compute[231311]: 2025-11-23 21:07:00.411 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:07:00 np0005532763 nova_compute[231311]: 2025-11-23 21:07:00.412 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:07:00 np0005532763 nova_compute[231311]: 2025-11-23 21:07:00.413 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:07:00 np0005532763 nova_compute[231311]: 2025-11-23 21:07:00.435 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:07:00 np0005532763 nova_compute[231311]: 2025-11-23 21:07:00.436 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:07:00 np0005532763 nova_compute[231311]: 2025-11-23 21:07:00.437 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:07:00 np0005532763 nova_compute[231311]: 2025-11-23 21:07:00.437 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:07:00 np0005532763 nova_compute[231311]: 2025-11-23 21:07:00.438 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:07:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:00 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:00.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:00 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:07:00 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/705310154' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:07:00 np0005532763 nova_compute[231311]: 2025-11-23 21:07:00.955 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:07:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:01 np0005532763 nova_compute[231311]: 2025-11-23 21:07:01.197 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:07:01 np0005532763 nova_compute[231311]: 2025-11-23 21:07:01.199 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5225MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:07:01 np0005532763 nova_compute[231311]: 2025-11-23 21:07:01.199 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:07:01 np0005532763 nova_compute[231311]: 2025-11-23 21:07:01.199 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:07:01 np0005532763 nova_compute[231311]: 2025-11-23 21:07:01.315 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:07:01 np0005532763 nova_compute[231311]: 2025-11-23 21:07:01.316 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:07:01 np0005532763 nova_compute[231311]: 2025-11-23 21:07:01.374 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Refreshing inventories for resource provider 20c32e0a-de2c-427c-9273-fac11e2660f4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 23 16:07:01 np0005532763 nova_compute[231311]: 2025-11-23 21:07:01.436 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Updating ProviderTree inventory for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 23 16:07:01 np0005532763 nova_compute[231311]: 2025-11-23 21:07:01.437 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Updating inventory in ProviderTree for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 23 16:07:01 np0005532763 nova_compute[231311]: 2025-11-23 21:07:01.461 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Refreshing aggregate associations for resource provider 20c32e0a-de2c-427c-9273-fac11e2660f4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 23 16:07:01 np0005532763 nova_compute[231311]: 2025-11-23 21:07:01.482 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Refreshing trait associations for resource provider 20c32e0a-de2c-427c-9273-fac11e2660f4, traits: COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AVX2,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_FMA3,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE,HW_CPU_X86_SVM,HW_CPU_X86_BMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 23 16:07:01 np0005532763 nova_compute[231311]: 2025-11-23 21:07:01.513 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:07:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:01 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:07:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:01.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:07:01 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:07:01 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3645982558' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:07:01 np0005532763 nova_compute[231311]: 2025-11-23 21:07:01.981 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:07:01 np0005532763 nova_compute[231311]: 2025-11-23 21:07:01.989 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:07:02 np0005532763 nova_compute[231311]: 2025-11-23 21:07:02.011 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:07:02 np0005532763 nova_compute[231311]: 2025-11-23 21:07:02.014 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:07:02 np0005532763 nova_compute[231311]: 2025-11-23 21:07:02.015 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.815s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:07:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:02 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:02 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:02.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:02 np0005532763 nova_compute[231311]: 2025-11-23 21:07:02.986 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:07:02 np0005532763 nova_compute[231311]: 2025-11-23 21:07:02.986 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:07:02 np0005532763 nova_compute[231311]: 2025-11-23 21:07:02.987 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:07:02 np0005532763 nova_compute[231311]: 2025-11-23 21:07:02.987 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:07:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:03 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:03.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:04 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:04 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:04.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:07:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:05 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:05.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:06 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564003e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:06 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:07:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:06.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:07:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:07:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1035460735' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:07:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:07:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1035460735' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:07:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:07 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:07.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:08 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:08 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564004010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:08.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:07:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:09 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:07:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:09.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:07:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:10 np0005532763 podman[234742]: 2025-11-23 21:07:10.206033244 +0000 UTC m=+0.086203820 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Nov 23 16:07:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:10 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:10 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:10.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:11 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:11.219 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3a:26:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:d5:4d:db:d5:2b'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:07:11 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:11.221 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 23 16:07:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:11 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564004030 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:11.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:12 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:12 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:12.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:13 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:13.223 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10e3bf57-dd2d-4b94-851f-925bcd297dde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:07:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:13 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:07:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:13.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:07:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:14 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:14 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:14.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:07:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:15 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc57c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:15.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:16 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:16 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558001070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:16.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:17 np0005532763 podman[234771]: 2025-11-23 21:07:17.239106813 +0000 UTC m=+0.114789586 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 16:07:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:17 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:17.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:18 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:18 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:18.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:18 np0005532763 nova_compute[231311]: 2025-11-23 21:07:18.821 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:07:18 np0005532763 nova_compute[231311]: 2025-11-23 21:07:18.822 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:07:18 np0005532763 nova_compute[231311]: 2025-11-23 21:07:18.838 231315 DEBUG nova.compute.manager [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 23 16:07:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:18 np0005532763 nova_compute[231311]: 2025-11-23 21:07:18.915 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:07:18 np0005532763 nova_compute[231311]: 2025-11-23 21:07:18.916 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:07:18 np0005532763 nova_compute[231311]: 2025-11-23 21:07:18.924 231315 DEBUG nova.virt.hardware [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 23 16:07:18 np0005532763 nova_compute[231311]: 2025-11-23 21:07:18.924 231315 INFO nova.compute.claims [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.038 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:07:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:07:19 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1983814822' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.512 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.523 231315 DEBUG nova.compute.provider_tree [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.537 231315 DEBUG nova.scheduler.client.report [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.563 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.647s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.564 231315 DEBUG nova.compute.manager [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.631 231315 DEBUG nova.compute.manager [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.632 231315 DEBUG nova.network.neutron [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.686 231315 INFO nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.708 231315 DEBUG nova.compute.manager [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.809 231315 DEBUG nova.compute.manager [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.811 231315 DEBUG nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.811 231315 INFO nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Creating image(s)#033[00m
Nov 23 16:07:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.853 231315 DEBUG nova.storage.rbd_utils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image cc03c89f-bbbe-477a-ad7c-2f31c9125d20_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:07:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.894 231315 DEBUG nova.storage.rbd_utils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image cc03c89f-bbbe-477a-ad7c-2f31c9125d20_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.932 231315 DEBUG nova.storage.rbd_utils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image cc03c89f-bbbe-477a-ad7c-2f31c9125d20_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.942 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:07:19 np0005532763 nova_compute[231311]: 2025-11-23 21:07:19.943 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:07:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:19 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558001070 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:19.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:20 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:20 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:20 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:07:20 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:07:20 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:07:20 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:07:20 np0005532763 nova_compute[231311]: 2025-11-23 21:07:20.580 231315 WARNING oslo_policy.policy [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Nov 23 16:07:20 np0005532763 nova_compute[231311]: 2025-11-23 21:07:20.580 231315 WARNING oslo_policy.policy [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Nov 23 16:07:20 np0005532763 nova_compute[231311]: 2025-11-23 21:07:20.583 231315 DEBUG nova.policy [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9fb5352c62684f2ba3a326a953a10dfe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '782593db60784ab8bff41fe87d72ff5f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 23 16:07:20 np0005532763 nova_compute[231311]: 2025-11-23 21:07:20.641 231315 DEBUG nova.virt.libvirt.imagebackend [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Image locations are: [{'url': 'rbd://03808be8-ae4a-5548-82e6-4a294f1bc627/images/3c45fa6c-8a99-4359-a34e-d89f4e1e77d0/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://03808be8-ae4a-5548-82e6-4a294f1bc627/images/3c45fa6c-8a99-4359-a34e-d89f4e1e77d0/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Nov 23 16:07:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:20.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:21 np0005532763 podman[234959]: 2025-11-23 21:07:21.247610871 +0000 UTC m=+0.122074599 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 23 16:07:21 np0005532763 nova_compute[231311]: 2025-11-23 21:07:21.443 231315 DEBUG nova.network.neutron [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Successfully created port: fa39fedb-0393-4e6b-a380-50741abeeb9d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 23 16:07:21 np0005532763 nova_compute[231311]: 2025-11-23 21:07:21.546 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:07:21 np0005532763 nova_compute[231311]: 2025-11-23 21:07:21.634 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56.part --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:07:21 np0005532763 nova_compute[231311]: 2025-11-23 21:07:21.636 231315 DEBUG nova.virt.images [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] 3c45fa6c-8a99-4359-a34e-d89f4e1e77d0 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 23 16:07:21 np0005532763 nova_compute[231311]: 2025-11-23 21:07:21.638 231315 DEBUG nova.privsep.utils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 23 16:07:21 np0005532763 nova_compute[231311]: 2025-11-23 21:07:21.639 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56.part /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:07:21 np0005532763 nova_compute[231311]: 2025-11-23 21:07:21.887 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56.part /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56.converted" returned: 0 in 0.248s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:07:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:21 np0005532763 nova_compute[231311]: 2025-11-23 21:07:21.895 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:07:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:21 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:07:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:21.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:07:21 np0005532763 nova_compute[231311]: 2025-11-23 21:07:21.982 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56.converted --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:07:21 np0005532763 nova_compute[231311]: 2025-11-23 21:07:21.984 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.040s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.019 231315 DEBUG nova.storage.rbd_utils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image cc03c89f-bbbe-477a-ad7c-2f31c9125d20_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.024 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 cc03c89f-bbbe-477a-ad7c-2f31c9125d20_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:07:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:22 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.420 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 cc03c89f-bbbe-477a-ad7c-2f31c9125d20_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.397s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:07:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:22 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.535 231315 DEBUG nova.storage.rbd_utils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] resizing rbd image cc03c89f-bbbe-477a-ad7c-2f31c9125d20_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.586 231315 DEBUG nova.network.neutron [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Successfully updated port: fa39fedb-0393-4e6b-a380-50741abeeb9d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.606 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "refresh_cache-cc03c89f-bbbe-477a-ad7c-2f31c9125d20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.606 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquired lock "refresh_cache-cc03c89f-bbbe-477a-ad7c-2f31c9125d20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.607 231315 DEBUG nova.network.neutron [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 23 16:07:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:07:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:22.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.685 231315 DEBUG nova.compute.manager [req-1f99c711-fc99-44e8-be7d-3e298bc275a1 req-d7cd73e5-c92c-4af1-bc2a-756394ce2a60 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Received event network-changed-fa39fedb-0393-4e6b-a380-50741abeeb9d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.686 231315 DEBUG nova.compute.manager [req-1f99c711-fc99-44e8-be7d-3e298bc275a1 req-d7cd73e5-c92c-4af1-bc2a-756394ce2a60 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Refreshing instance network info cache due to event network-changed-fa39fedb-0393-4e6b-a380-50741abeeb9d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.687 231315 DEBUG oslo_concurrency.lockutils [req-1f99c711-fc99-44e8-be7d-3e298bc275a1 req-d7cd73e5-c92c-4af1-bc2a-756394ce2a60 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "refresh_cache-cc03c89f-bbbe-477a-ad7c-2f31c9125d20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.693 231315 DEBUG nova.objects.instance [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lazy-loading 'migration_context' on Instance uuid cc03c89f-bbbe-477a-ad7c-2f31c9125d20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.713 231315 DEBUG nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.713 231315 DEBUG nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Ensure instance console log exists: /var/lib/nova/instances/cc03c89f-bbbe-477a-ad7c-2f31c9125d20/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.714 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.715 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.715 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:07:22 np0005532763 nova_compute[231311]: 2025-11-23 21:07:22.726 231315 DEBUG nova.network.neutron [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 23 16:07:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.452 231315 DEBUG nova.network.neutron [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Updating instance_info_cache with network_info: [{"id": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "address": "fa:16:3e:6c:43:c1", "network": {"id": "4330ece9-d3f3-4995-b96e-abf6bfccd4cc", "bridge": "br-int", "label": "tempest-network-smoke--390742040", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa39fedb-03", "ovs_interfaceid": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.472 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Releasing lock "refresh_cache-cc03c89f-bbbe-477a-ad7c-2f31c9125d20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.473 231315 DEBUG nova.compute.manager [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Instance network_info: |[{"id": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "address": "fa:16:3e:6c:43:c1", "network": {"id": "4330ece9-d3f3-4995-b96e-abf6bfccd4cc", "bridge": "br-int", "label": "tempest-network-smoke--390742040", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa39fedb-03", "ovs_interfaceid": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.473 231315 DEBUG oslo_concurrency.lockutils [req-1f99c711-fc99-44e8-be7d-3e298bc275a1 req-d7cd73e5-c92c-4af1-bc2a-756394ce2a60 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquired lock "refresh_cache-cc03c89f-bbbe-477a-ad7c-2f31c9125d20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.474 231315 DEBUG nova.network.neutron [req-1f99c711-fc99-44e8-be7d-3e298bc275a1 req-d7cd73e5-c92c-4af1-bc2a-756394ce2a60 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Refreshing network info cache for port fa39fedb-0393-4e6b-a380-50741abeeb9d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.479 231315 DEBUG nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Start _get_guest_xml network_info=[{"id": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "address": "fa:16:3e:6c:43:c1", "network": {"id": "4330ece9-d3f3-4995-b96e-abf6bfccd4cc", "bridge": "br-int", "label": "tempest-network-smoke--390742040", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa39fedb-03", "ovs_interfaceid": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-23T21:05:43Z,direct_url=<?>,disk_format='qcow2',id=3c45fa6c-8a99-4359-a34e-d89f4e1e77d0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3f8fb5175f85402ba20cf9c6989d47cf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-23T21:05:47Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'encryption_options': None, 'size': 0, 'image_id': '3c45fa6c-8a99-4359-a34e-d89f4e1e77d0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.485 231315 WARNING nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.497 231315 DEBUG nova.virt.libvirt.host [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.498 231315 DEBUG nova.virt.libvirt.host [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.502 231315 DEBUG nova.virt.libvirt.host [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.503 231315 DEBUG nova.virt.libvirt.host [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.504 231315 DEBUG nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.504 231315 DEBUG nova.virt.hardware [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-23T21:05:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='56044b93-2979-48aa-b67f-c37e1b489306',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-23T21:05:43Z,direct_url=<?>,disk_format='qcow2',id=3c45fa6c-8a99-4359-a34e-d89f4e1e77d0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3f8fb5175f85402ba20cf9c6989d47cf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-23T21:05:47Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.505 231315 DEBUG nova.virt.hardware [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.505 231315 DEBUG nova.virt.hardware [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.505 231315 DEBUG nova.virt.hardware [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.505 231315 DEBUG nova.virt.hardware [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.506 231315 DEBUG nova.virt.hardware [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.506 231315 DEBUG nova.virt.hardware [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.506 231315 DEBUG nova.virt.hardware [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.507 231315 DEBUG nova.virt.hardware [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.507 231315 DEBUG nova.virt.hardware [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.507 231315 DEBUG nova.virt.hardware [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.513 231315 DEBUG nova.privsep.utils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.514 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:07:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:23 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:23 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 16:07:23 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2482210232' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Nov 23 16:07:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:23.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:23 np0005532763 nova_compute[231311]: 2025-11-23 21:07:23.989 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.027 231315 DEBUG nova.storage.rbd_utils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image cc03c89f-bbbe-477a-ad7c-2f31c9125d20_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.033 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:07:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:24 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558001dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 16:07:24 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2495490603' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Nov 23 16:07:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:24 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.511 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.514 231315 DEBUG nova.virt.libvirt.vif [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T21:07:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-749436504',display_name='tempest-TestNetworkBasicOps-server-749436504',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-749436504',id=2,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKi8custUVsiaaq6kl1F/UzCBdG1D+LU2LLwnWBAVFFHysn/MJX0KH/er5rYGFW/a70JpuJYuPjgSDIy48bKBG98pVDfg3bXEvJWC00N0L2Ff6HHmtCb6nNrW876ZSacXA==',key_name='tempest-TestNetworkBasicOps-1109475132',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-10j1rhls',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-23T21:07:19Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=cc03c89f-bbbe-477a-ad7c-2f31c9125d20,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "address": "fa:16:3e:6c:43:c1", "network": {"id": "4330ece9-d3f3-4995-b96e-abf6bfccd4cc", "bridge": "br-int", "label": "tempest-network-smoke--390742040", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa39fedb-03", "ovs_interfaceid": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.515 231315 DEBUG nova.network.os_vif_util [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "address": "fa:16:3e:6c:43:c1", "network": {"id": "4330ece9-d3f3-4995-b96e-abf6bfccd4cc", "bridge": "br-int", "label": "tempest-network-smoke--390742040", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa39fedb-03", "ovs_interfaceid": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.516 231315 DEBUG nova.network.os_vif_util [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:43:c1,bridge_name='br-int',has_traffic_filtering=True,id=fa39fedb-0393-4e6b-a380-50741abeeb9d,network=Network(4330ece9-d3f3-4995-b96e-abf6bfccd4cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa39fedb-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.520 231315 DEBUG nova.objects.instance [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lazy-loading 'pci_devices' on Instance uuid cc03c89f-bbbe-477a-ad7c-2f31c9125d20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:07:24 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:07:24 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.535 231315 DEBUG nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] End _get_guest_xml xml=<domain type="kvm">
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  <uuid>cc03c89f-bbbe-477a-ad7c-2f31c9125d20</uuid>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  <name>instance-00000002</name>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  <memory>131072</memory>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  <vcpu>1</vcpu>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  <metadata>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <nova:name>tempest-TestNetworkBasicOps-server-749436504</nova:name>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <nova:creationTime>2025-11-23 21:07:23</nova:creationTime>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <nova:flavor name="m1.nano">
Nov 23 16:07:24 np0005532763 nova_compute[231311]:        <nova:memory>128</nova:memory>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:        <nova:disk>1</nova:disk>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:        <nova:swap>0</nova:swap>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:        <nova:ephemeral>0</nova:ephemeral>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:        <nova:vcpus>1</nova:vcpus>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      </nova:flavor>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <nova:owner>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:        <nova:user uuid="9fb5352c62684f2ba3a326a953a10dfe">tempest-TestNetworkBasicOps-1975357669-project-member</nova:user>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:        <nova:project uuid="782593db60784ab8bff41fe87d72ff5f">tempest-TestNetworkBasicOps-1975357669</nova:project>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      </nova:owner>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <nova:root type="image" uuid="3c45fa6c-8a99-4359-a34e-d89f4e1e77d0"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <nova:ports>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:        <nova:port uuid="fa39fedb-0393-4e6b-a380-50741abeeb9d">
Nov 23 16:07:24 np0005532763 nova_compute[231311]:          <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:        </nova:port>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      </nova:ports>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    </nova:instance>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  </metadata>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  <sysinfo type="smbios">
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <system>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <entry name="manufacturer">RDO</entry>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <entry name="product">OpenStack Compute</entry>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <entry name="serial">cc03c89f-bbbe-477a-ad7c-2f31c9125d20</entry>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <entry name="uuid">cc03c89f-bbbe-477a-ad7c-2f31c9125d20</entry>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <entry name="family">Virtual Machine</entry>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    </system>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  </sysinfo>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  <os>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <boot dev="hd"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <smbios mode="sysinfo"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  </os>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  <features>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <acpi/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <apic/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <vmcoreinfo/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  </features>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  <clock offset="utc">
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <timer name="pit" tickpolicy="delay"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <timer name="hpet" present="no"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  </clock>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  <cpu mode="host-model" match="exact">
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <topology sockets="1" cores="1" threads="1"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  </cpu>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  <devices>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <disk type="network" device="disk">
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <driver type="raw" cache="none"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <source protocol="rbd" name="vms/cc03c89f-bbbe-477a-ad7c-2f31c9125d20_disk">
Nov 23 16:07:24 np0005532763 nova_compute[231311]:        <host name="192.168.122.100" port="6789"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:        <host name="192.168.122.102" port="6789"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:        <host name="192.168.122.101" port="6789"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      </source>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <auth username="openstack">
Nov 23 16:07:24 np0005532763 nova_compute[231311]:        <secret type="ceph" uuid="03808be8-ae4a-5548-82e6-4a294f1bc627"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      </auth>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <target dev="vda" bus="virtio"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    </disk>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <disk type="network" device="cdrom">
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <driver type="raw" cache="none"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <source protocol="rbd" name="vms/cc03c89f-bbbe-477a-ad7c-2f31c9125d20_disk.config">
Nov 23 16:07:24 np0005532763 nova_compute[231311]:        <host name="192.168.122.100" port="6789"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:        <host name="192.168.122.102" port="6789"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:        <host name="192.168.122.101" port="6789"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      </source>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <auth username="openstack">
Nov 23 16:07:24 np0005532763 nova_compute[231311]:        <secret type="ceph" uuid="03808be8-ae4a-5548-82e6-4a294f1bc627"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      </auth>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <target dev="sda" bus="sata"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    </disk>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <interface type="ethernet">
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <mac address="fa:16:3e:6c:43:c1"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <model type="virtio"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <driver name="vhost" rx_queue_size="512"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <mtu size="1442"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <target dev="tapfa39fedb-03"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    </interface>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <serial type="pty">
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <log file="/var/lib/nova/instances/cc03c89f-bbbe-477a-ad7c-2f31c9125d20/console.log" append="off"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    </serial>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <video>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <model type="virtio"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    </video>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <input type="tablet" bus="usb"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <rng model="virtio">
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <backend model="random">/dev/urandom</backend>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    </rng>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <controller type="usb" index="0"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    <memballoon model="virtio">
Nov 23 16:07:24 np0005532763 nova_compute[231311]:      <stats period="10"/>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:    </memballoon>
Nov 23 16:07:24 np0005532763 nova_compute[231311]:  </devices>
Nov 23 16:07:24 np0005532763 nova_compute[231311]: </domain>
Nov 23 16:07:24 np0005532763 nova_compute[231311]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.536 231315 DEBUG nova.compute.manager [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Preparing to wait for external event network-vif-plugged-fa39fedb-0393-4e6b-a380-50741abeeb9d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.537 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.537 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.537 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.538 231315 DEBUG nova.virt.libvirt.vif [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T21:07:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-749436504',display_name='tempest-TestNetworkBasicOps-server-749436504',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-749436504',id=2,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKi8custUVsiaaq6kl1F/UzCBdG1D+LU2LLwnWBAVFFHysn/MJX0KH/er5rYGFW/a70JpuJYuPjgSDIy48bKBG98pVDfg3bXEvJWC00N0L2Ff6HHmtCb6nNrW876ZSacXA==',key_name='tempest-TestNetworkBasicOps-1109475132',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-10j1rhls',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-23T21:07:19Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=cc03c89f-bbbe-477a-ad7c-2f31c9125d20,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "address": "fa:16:3e:6c:43:c1", "network": {"id": "4330ece9-d3f3-4995-b96e-abf6bfccd4cc", "bridge": "br-int", "label": "tempest-network-smoke--390742040", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa39fedb-03", "ovs_interfaceid": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.539 231315 DEBUG nova.network.os_vif_util [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "address": "fa:16:3e:6c:43:c1", "network": {"id": "4330ece9-d3f3-4995-b96e-abf6bfccd4cc", "bridge": "br-int", "label": "tempest-network-smoke--390742040", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa39fedb-03", "ovs_interfaceid": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.540 231315 DEBUG nova.network.os_vif_util [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:43:c1,bridge_name='br-int',has_traffic_filtering=True,id=fa39fedb-0393-4e6b-a380-50741abeeb9d,network=Network(4330ece9-d3f3-4995-b96e-abf6bfccd4cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa39fedb-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.540 231315 DEBUG os_vif [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:43:c1,bridge_name='br-int',has_traffic_filtering=True,id=fa39fedb-0393-4e6b-a380-50741abeeb9d,network=Network(4330ece9-d3f3-4995-b96e-abf6bfccd4cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa39fedb-03') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.594 231315 DEBUG ovsdbapp.backend.ovs_idl [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.594 231315 DEBUG ovsdbapp.backend.ovs_idl [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.595 231315 DEBUG ovsdbapp.backend.ovs_idl [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.595 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.596 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [POLLOUT] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.596 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.597 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.599 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.602 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.618 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.618 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.619 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 23 16:07:24 np0005532763 nova_compute[231311]: 2025-11-23 21:07:24.620 231315 INFO oslo.privsep.daemon [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpinqgrqcl/privsep.sock']#033[00m
Nov 23 16:07:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:24.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:07:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.294 231315 INFO oslo.privsep.daemon [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Spawned new privsep daemon via rootwrap#033[00m
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.174 235226 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.180 235226 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.183 235226 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.183 235226 INFO oslo.privsep.daemon [-] privsep daemon running as pid 235226#033[00m
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.359 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.645 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.646 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa39fedb-03, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.647 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfa39fedb-03, col_values=(('external_ids', {'iface-id': 'fa39fedb-0393-4e6b-a380-50741abeeb9d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6c:43:c1', 'vm-uuid': 'cc03c89f-bbbe-477a-ad7c-2f31c9125d20'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.649 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:25 np0005532763 NetworkManager[48849]: <info>  [1763932045.6510] manager: (tapfa39fedb-03): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.653 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.659 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.660 231315 INFO os_vif [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:43:c1,bridge_name='br-int',has_traffic_filtering=True,id=fa39fedb-0393-4e6b-a380-50741abeeb9d,network=Network(4330ece9-d3f3-4995-b96e-abf6bfccd4cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa39fedb-03')#033[00m
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.713 231315 DEBUG nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.713 231315 DEBUG nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.714 231315 DEBUG nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No VIF found with MAC fa:16:3e:6c:43:c1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.715 231315 INFO nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Using config drive#033[00m
Nov 23 16:07:25 np0005532763 nova_compute[231311]: 2025-11-23 21:07:25.750 231315 DEBUG nova.storage.rbd_utils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image cc03c89f-bbbe-477a-ad7c-2f31c9125d20_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:07:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:25 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc580004780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:25.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:26 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:26 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558001dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:26.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:26 np0005532763 nova_compute[231311]: 2025-11-23 21:07:26.981 231315 INFO nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Creating config drive at /var/lib/nova/instances/cc03c89f-bbbe-477a-ad7c-2f31c9125d20/disk.config#033[00m
Nov 23 16:07:26 np0005532763 nova_compute[231311]: 2025-11-23 21:07:26.990 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cc03c89f-bbbe-477a-ad7c-2f31c9125d20/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp94pmk0zn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:07:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:27 np0005532763 nova_compute[231311]: 2025-11-23 21:07:27.133 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cc03c89f-bbbe-477a-ad7c-2f31c9125d20/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp94pmk0zn" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:07:27 np0005532763 nova_compute[231311]: 2025-11-23 21:07:27.181 231315 DEBUG nova.storage.rbd_utils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image cc03c89f-bbbe-477a-ad7c-2f31c9125d20_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:07:27 np0005532763 nova_compute[231311]: 2025-11-23 21:07:27.186 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/cc03c89f-bbbe-477a-ad7c-2f31c9125d20/disk.config cc03c89f-bbbe-477a-ad7c-2f31c9125d20_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:07:27 np0005532763 nova_compute[231311]: 2025-11-23 21:07:27.394 231315 DEBUG oslo_concurrency.processutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/cc03c89f-bbbe-477a-ad7c-2f31c9125d20/disk.config cc03c89f-bbbe-477a-ad7c-2f31c9125d20_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.208s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:07:27 np0005532763 nova_compute[231311]: 2025-11-23 21:07:27.396 231315 INFO nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Deleting local config drive /var/lib/nova/instances/cc03c89f-bbbe-477a-ad7c-2f31c9125d20/disk.config because it was imported into RBD.#033[00m
Nov 23 16:07:27 np0005532763 systemd[1]: Starting libvirt secret daemon...
Nov 23 16:07:27 np0005532763 systemd[1]: Started libvirt secret daemon.
Nov 23 16:07:27 np0005532763 kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 23 16:07:27 np0005532763 kernel: tapfa39fedb-03: entered promiscuous mode
Nov 23 16:07:27 np0005532763 NetworkManager[48849]: <info>  [1763932047.5509] manager: (tapfa39fedb-03): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Nov 23 16:07:27 np0005532763 nova_compute[231311]: 2025-11-23 21:07:27.585 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:27 np0005532763 ovn_controller[133425]: 2025-11-23T21:07:27Z|00027|binding|INFO|Claiming lport fa39fedb-0393-4e6b-a380-50741abeeb9d for this chassis.
Nov 23 16:07:27 np0005532763 ovn_controller[133425]: 2025-11-23T21:07:27Z|00028|binding|INFO|fa39fedb-0393-4e6b-a380-50741abeeb9d: Claiming fa:16:3e:6c:43:c1 10.100.0.29
Nov 23 16:07:27 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:27.601 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:43:c1 10.100.0.29'], port_security=['fa:16:3e:6c:43:c1 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': 'cc03c89f-bbbe-477a-ad7c-2f31c9125d20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4330ece9-d3f3-4995-b96e-abf6bfccd4cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782593db60784ab8bff41fe87d72ff5f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '69d27e0d-1270-462f-9117-bfea049ad9d0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e0d310c-ba36-446f-b044-c8e4856a70c5, chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], logical_port=fa39fedb-0393-4e6b-a380-50741abeeb9d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:07:27 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:27.602 142920 INFO neutron.agent.ovn.metadata.agent [-] Port fa39fedb-0393-4e6b-a380-50741abeeb9d in datapath 4330ece9-d3f3-4995-b96e-abf6bfccd4cc bound to our chassis#033[00m
Nov 23 16:07:27 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:27.605 142920 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4330ece9-d3f3-4995-b96e-abf6bfccd4cc#033[00m
Nov 23 16:07:27 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:27.606 142920 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpv6y3gocv/privsep.sock']#033[00m
Nov 23 16:07:27 np0005532763 systemd-udevd[235326]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 16:07:27 np0005532763 NetworkManager[48849]: <info>  [1763932047.6331] device (tapfa39fedb-03): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 23 16:07:27 np0005532763 NetworkManager[48849]: <info>  [1763932047.6346] device (tapfa39fedb-03): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 23 16:07:27 np0005532763 systemd-machined[194484]: New machine qemu-1-instance-00000002.
Nov 23 16:07:27 np0005532763 nova_compute[231311]: 2025-11-23 21:07:27.675 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:27 np0005532763 ovn_controller[133425]: 2025-11-23T21:07:27Z|00029|binding|INFO|Setting lport fa39fedb-0393-4e6b-a380-50741abeeb9d ovn-installed in OVS
Nov 23 16:07:27 np0005532763 ovn_controller[133425]: 2025-11-23T21:07:27Z|00030|binding|INFO|Setting lport fa39fedb-0393-4e6b-a380-50741abeeb9d up in Southbound
Nov 23 16:07:27 np0005532763 nova_compute[231311]: 2025-11-23 21:07:27.681 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:27 np0005532763 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Nov 23 16:07:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:27 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:27.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:27 np0005532763 nova_compute[231311]: 2025-11-23 21:07:27.984 231315 DEBUG nova.network.neutron [req-1f99c711-fc99-44e8-be7d-3e298bc275a1 req-d7cd73e5-c92c-4af1-bc2a-756394ce2a60 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Updated VIF entry in instance network info cache for port fa39fedb-0393-4e6b-a380-50741abeeb9d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 23 16:07:27 np0005532763 nova_compute[231311]: 2025-11-23 21:07:27.984 231315 DEBUG nova.network.neutron [req-1f99c711-fc99-44e8-be7d-3e298bc275a1 req-d7cd73e5-c92c-4af1-bc2a-756394ce2a60 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Updating instance_info_cache with network_info: [{"id": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "address": "fa:16:3e:6c:43:c1", "network": {"id": "4330ece9-d3f3-4995-b96e-abf6bfccd4cc", "bridge": "br-int", "label": "tempest-network-smoke--390742040", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa39fedb-03", "ovs_interfaceid": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.006 231315 DEBUG oslo_concurrency.lockutils [req-1f99c711-fc99-44e8-be7d-3e298bc275a1 req-d7cd73e5-c92c-4af1-bc2a-756394ce2a60 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Releasing lock "refresh_cache-cc03c89f-bbbe-477a-ad7c-2f31c9125d20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:07:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.144 231315 DEBUG nova.virt.driver [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Emitting event <LifecycleEvent: 1763932048.14417, cc03c89f-bbbe-477a-ad7c-2f31c9125d20 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.145 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] VM Started (Lifecycle Event)#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.152 231315 DEBUG nova.compute.manager [req-9c5b042c-fbb3-40b7-b9b4-00bf0d8fd946 req-a493dbc1-5829-4ed1-bc79-b09285b9a690 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Received event network-vif-plugged-fa39fedb-0393-4e6b-a380-50741abeeb9d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.153 231315 DEBUG oslo_concurrency.lockutils [req-9c5b042c-fbb3-40b7-b9b4-00bf0d8fd946 req-a493dbc1-5829-4ed1-bc79-b09285b9a690 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.153 231315 DEBUG oslo_concurrency.lockutils [req-9c5b042c-fbb3-40b7-b9b4-00bf0d8fd946 req-a493dbc1-5829-4ed1-bc79-b09285b9a690 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.153 231315 DEBUG oslo_concurrency.lockutils [req-9c5b042c-fbb3-40b7-b9b4-00bf0d8fd946 req-a493dbc1-5829-4ed1-bc79-b09285b9a690 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.154 231315 DEBUG nova.compute.manager [req-9c5b042c-fbb3-40b7-b9b4-00bf0d8fd946 req-a493dbc1-5829-4ed1-bc79-b09285b9a690 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Processing event network-vif-plugged-fa39fedb-0393-4e6b-a380-50741abeeb9d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.155 231315 DEBUG nova.compute.manager [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.170 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.172 231315 DEBUG nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.177 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.181 231315 INFO nova.virt.libvirt.driver [-] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Instance spawned successfully.#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.182 231315 DEBUG nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.212 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.212 231315 DEBUG nova.virt.driver [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Emitting event <LifecycleEvent: 1763932048.146304, cc03c89f-bbbe-477a-ad7c-2f31c9125d20 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.213 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] VM Paused (Lifecycle Event)#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.224 231315 DEBUG nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.225 231315 DEBUG nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.225 231315 DEBUG nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.226 231315 DEBUG nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.226 231315 DEBUG nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.227 231315 DEBUG nova.virt.libvirt.driver [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.236 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.240 231315 DEBUG nova.virt.driver [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Emitting event <LifecycleEvent: 1763932048.1729932, cc03c89f-bbbe-477a-ad7c-2f31c9125d20 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.240 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] VM Resumed (Lifecycle Event)#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.269 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.274 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.299 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.310 231315 INFO nova.compute.manager [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Took 8.50 seconds to spawn the instance on the hypervisor.#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.312 231315 DEBUG nova.compute.manager [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:07:28 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:28.314 142920 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 23 16:07:28 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:28.314 142920 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpv6y3gocv/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 23 16:07:28 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:28.203 235389 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 23 16:07:28 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:28.210 235389 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 23 16:07:28 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:28.214 235389 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Nov 23 16:07:28 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:28.215 235389 INFO oslo.privsep.daemon [-] privsep daemon running as pid 235389#033[00m
Nov 23 16:07:28 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:28.317 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[9047ab27-c000-4873-bddc-e439888975b4]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:28 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.403 231315 INFO nova.compute.manager [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Took 9.52 seconds to build instance.#033[00m
Nov 23 16:07:28 np0005532763 nova_compute[231311]: 2025-11-23 21:07:28.429 231315 DEBUG oslo_concurrency.lockutils [None req-959c51f5-b1fd-45b9-8d0f-f46c0158e563 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.607s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:07:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:28 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:28.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:28 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:28.839 235389 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:07:28 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:28.839 235389 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:07:28 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:28.839 235389 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:07:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:29.374 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[e5935078-2f42-4b72-a49f-35d857734f44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:29.375 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4330ece9-d1 in ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 23 16:07:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:29.377 235389 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4330ece9-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 23 16:07:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:29.377 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[471b3334-9a21-4a37-b6ce-853a341ce1ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:29.380 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[a64bf677-e966-4f9e-8059-a059c4015e10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:29.413 143034 DEBUG oslo.privsep.daemon [-] privsep: reply[c3da8f66-df34-4a1c-8727-9f400c1b5573]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:29.436 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[3ac43348-6269-45fa-a7f9-f418952ce0fb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:29.439 142920 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpleaazpde/privsep.sock']#033[00m
Nov 23 16:07:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:07:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:29 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:07:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:29.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:07:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:30 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:30.201 142920 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 23 16:07:30 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:30.203 142920 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpleaazpde/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 23 16:07:30 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:30.065 235405 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 23 16:07:30 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:30.073 235405 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 23 16:07:30 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:30.077 235405 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Nov 23 16:07:30 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:30.078 235405 INFO oslo.privsep.daemon [-] privsep daemon running as pid 235405#033[00m
Nov 23 16:07:30 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:30.207 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[3d7aeec3-d521-447e-b736-522a971e839d]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:30 np0005532763 nova_compute[231311]: 2025-11-23 21:07:30.215 231315 DEBUG nova.compute.manager [req-1fb15f8a-d380-4ed9-8acf-b6c9c01a5fff req-3f34ee26-7c7f-4360-ba37-c3b9627cfa70 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Received event network-vif-plugged-fa39fedb-0393-4e6b-a380-50741abeeb9d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:07:30 np0005532763 nova_compute[231311]: 2025-11-23 21:07:30.216 231315 DEBUG oslo_concurrency.lockutils [req-1fb15f8a-d380-4ed9-8acf-b6c9c01a5fff req-3f34ee26-7c7f-4360-ba37-c3b9627cfa70 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:07:30 np0005532763 nova_compute[231311]: 2025-11-23 21:07:30.217 231315 DEBUG oslo_concurrency.lockutils [req-1fb15f8a-d380-4ed9-8acf-b6c9c01a5fff req-3f34ee26-7c7f-4360-ba37-c3b9627cfa70 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:07:30 np0005532763 nova_compute[231311]: 2025-11-23 21:07:30.217 231315 DEBUG oslo_concurrency.lockutils [req-1fb15f8a-d380-4ed9-8acf-b6c9c01a5fff req-3f34ee26-7c7f-4360-ba37-c3b9627cfa70 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:07:30 np0005532763 nova_compute[231311]: 2025-11-23 21:07:30.218 231315 DEBUG nova.compute.manager [req-1fb15f8a-d380-4ed9-8acf-b6c9c01a5fff req-3f34ee26-7c7f-4360-ba37-c3b9627cfa70 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] No waiting events found dispatching network-vif-plugged-fa39fedb-0393-4e6b-a380-50741abeeb9d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:07:30 np0005532763 nova_compute[231311]: 2025-11-23 21:07:30.218 231315 WARNING nova.compute.manager [req-1fb15f8a-d380-4ed9-8acf-b6c9c01a5fff req-3f34ee26-7c7f-4360-ba37-c3b9627cfa70 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Received unexpected event network-vif-plugged-fa39fedb-0393-4e6b-a380-50741abeeb9d for instance with vm_state active and task_state None.#033[00m
Nov 23 16:07:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:30 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:30 np0005532763 nova_compute[231311]: 2025-11-23 21:07:30.362 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:30 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:30 np0005532763 nova_compute[231311]: 2025-11-23 21:07:30.649 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:30.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:30 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:30.710 235405 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:07:30 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:30.711 235405 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:07:30 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:30.711 235405 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:07:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.248 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[f7aec016-ddb0-46b8-aca0-d9ae3cbd9aff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:31 np0005532763 NetworkManager[48849]: <info>  [1763932051.2650] manager: (tap4330ece9-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.265 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[3cce8100-7664-4727-8bba-d52cdb3c5b51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:31 np0005532763 systemd-udevd[235418]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.300 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[576f2ff9-de16-490c-9a9c-640b559acf44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.303 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[fe903fa0-5098-4e8d-927c-c64122bd1a8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:31 np0005532763 NetworkManager[48849]: <info>  [1763932051.3280] device (tap4330ece9-d0): carrier: link connected
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.340 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[abe2e663-bd9b-4712-b0ed-3c4d4da0ff9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.359 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[e1bd50f2-eff7-4535-ba64-b4b5f117e287]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4330ece9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:cc:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 400086, 'reachable_time': 38500, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235436, 'error': None, 'target': 'ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.380 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[409da8cd-843d-48d6-9236-3fef153e2945]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe28:ccc0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 400086, 'tstamp': 400086}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235437, 'error': None, 'target': 'ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.396 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[e218ff93-60fb-4a2a-9db5-b437808895ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4330ece9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:cc:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 400086, 'reachable_time': 38500, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235438, 'error': None, 'target': 'ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.432 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[40b82ba8-e6fe-4a9c-b289-fe6b097ee608]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.505 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[d1329a29-c662-4f36-83ce-51bee56c9095]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.508 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4330ece9-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.508 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.509 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4330ece9-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:07:31 np0005532763 NetworkManager[48849]: <info>  [1763932051.5141] manager: (tap4330ece9-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Nov 23 16:07:31 np0005532763 kernel: tap4330ece9-d0: entered promiscuous mode
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.519 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4330ece9-d0, col_values=(('external_ids', {'iface-id': '8d94bace-977f-44bc-ad82-0e61b6761f81'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:07:31 np0005532763 nova_compute[231311]: 2025-11-23 21:07:31.512 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:31 np0005532763 ovn_controller[133425]: 2025-11-23T21:07:31Z|00031|binding|INFO|Releasing lport 8d94bace-977f-44bc-ad82-0e61b6761f81 from this chassis (sb_readonly=0)
Nov 23 16:07:31 np0005532763 nova_compute[231311]: 2025-11-23 21:07:31.551 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.552 142920 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4330ece9-d3f3-4995-b96e-abf6bfccd4cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4330ece9-d3f3-4995-b96e-abf6bfccd4cc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 23 16:07:31 np0005532763 nova_compute[231311]: 2025-11-23 21:07:31.552 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.554 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[001d4db5-f83e-4c7b-8fbb-414d0e506bee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.557 142920 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: global
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    log         /dev/log local0 debug
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    log-tag     haproxy-metadata-proxy-4330ece9-d3f3-4995-b96e-abf6bfccd4cc
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    user        root
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    group       root
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    maxconn     1024
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    pidfile     /var/lib/neutron/external/pids/4330ece9-d3f3-4995-b96e-abf6bfccd4cc.pid.haproxy
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    daemon
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: defaults
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    log global
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    mode http
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    option httplog
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    option dontlognull
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    option http-server-close
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    option forwardfor
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    retries                 3
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    timeout http-request    30s
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    timeout connect         30s
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    timeout client          32s
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    timeout server          32s
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    timeout http-keep-alive 30s
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: listen listener
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    bind 169.254.169.254:80
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    server metadata /var/lib/neutron/metadata_proxy
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]:    http-request add-header X-OVN-Network-ID 4330ece9-d3f3-4995-b96e-abf6bfccd4cc
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 23 16:07:31 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:31.559 142920 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc', 'env', 'PROCESS_TAG=haproxy-4330ece9-d3f3-4995-b96e-abf6bfccd4cc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4330ece9-d3f3-4995-b96e-abf6bfccd4cc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 23 16:07:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:31 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:31.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:32 np0005532763 podman[235472]: 2025-11-23 21:07:32.096044633 +0000 UTC m=+0.079278808 container create 6b1cb2e9a7625636d2f2b1f9b5e07eea934f09089940a575da45dfdf7b1f3886 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 16:07:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:32 np0005532763 podman[235472]: 2025-11-23 21:07:32.046537405 +0000 UTC m=+0.029771640 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 23 16:07:32 np0005532763 systemd[1]: Started libpod-conmon-6b1cb2e9a7625636d2f2b1f9b5e07eea934f09089940a575da45dfdf7b1f3886.scope.
Nov 23 16:07:32 np0005532763 systemd[1]: Started libcrun container.
Nov 23 16:07:32 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a01ccee854440c1b5756d72900a02f9be5e9b26a3e6631976d782e2a024b91c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 16:07:32 np0005532763 podman[235472]: 2025-11-23 21:07:32.206562859 +0000 UTC m=+0.189797094 container init 6b1cb2e9a7625636d2f2b1f9b5e07eea934f09089940a575da45dfdf7b1f3886 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 23 16:07:32 np0005532763 podman[235472]: 2025-11-23 21:07:32.217490503 +0000 UTC m=+0.200724698 container start 6b1cb2e9a7625636d2f2b1f9b5e07eea934f09089940a575da45dfdf7b1f3886 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 23 16:07:32 np0005532763 neutron-haproxy-ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc[235488]: [NOTICE]   (235492) : New worker (235494) forked
Nov 23 16:07:32 np0005532763 neutron-haproxy-ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc[235488]: [NOTICE]   (235492) : Loading success.
Nov 23 16:07:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:32 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:32 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:32.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:33 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:33.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:34 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:34 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:34 np0005532763 nova_compute[231311]: 2025-11-23 21:07:34.613 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:34 np0005532763 NetworkManager[48849]: <info>  [1763932054.6160] manager: (patch-provnet-ce139dcc-0def-41ea-bc8f-4f8d9359e223-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/27)
Nov 23 16:07:34 np0005532763 NetworkManager[48849]: <info>  [1763932054.6164] device (patch-provnet-ce139dcc-0def-41ea-bc8f-4f8d9359e223-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 23 16:07:34 np0005532763 NetworkManager[48849]: <info>  [1763932054.6176] manager: (patch-br-int-to-provnet-ce139dcc-0def-41ea-bc8f-4f8d9359e223): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/28)
Nov 23 16:07:34 np0005532763 NetworkManager[48849]: <info>  [1763932054.6179] device (patch-br-int-to-provnet-ce139dcc-0def-41ea-bc8f-4f8d9359e223)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 23 16:07:34 np0005532763 NetworkManager[48849]: <info>  [1763932054.6187] manager: (patch-provnet-ce139dcc-0def-41ea-bc8f-4f8d9359e223-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Nov 23 16:07:34 np0005532763 NetworkManager[48849]: <info>  [1763932054.6192] manager: (patch-br-int-to-provnet-ce139dcc-0def-41ea-bc8f-4f8d9359e223): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Nov 23 16:07:34 np0005532763 NetworkManager[48849]: <info>  [1763932054.6196] device (patch-provnet-ce139dcc-0def-41ea-bc8f-4f8d9359e223-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 23 16:07:34 np0005532763 NetworkManager[48849]: <info>  [1763932054.6199] device (patch-br-int-to-provnet-ce139dcc-0def-41ea-bc8f-4f8d9359e223)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 23 16:07:34 np0005532763 nova_compute[231311]: 2025-11-23 21:07:34.662 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:34 np0005532763 ovn_controller[133425]: 2025-11-23T21:07:34Z|00032|binding|INFO|Releasing lport 8d94bace-977f-44bc-ad82-0e61b6761f81 from this chassis (sb_readonly=0)
Nov 23 16:07:34 np0005532763 nova_compute[231311]: 2025-11-23 21:07:34.675 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:07:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:34.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:07:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:07:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:35 np0005532763 nova_compute[231311]: 2025-11-23 21:07:35.365 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:35 np0005532763 nova_compute[231311]: 2025-11-23 21:07:35.701 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:35 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:07:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:35.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:07:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:36 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:36 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:36.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:37 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000055s ======
Nov 23 16:07:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:37.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Nov 23 16:07:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:38 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:38 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:07:38.661122) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932058661187, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 900, "num_deletes": 251, "total_data_size": 1860208, "memory_usage": 1888512, "flush_reason": "Manual Compaction"}
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932058671507, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1227461, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25290, "largest_seqno": 26184, "table_properties": {"data_size": 1223306, "index_size": 1871, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9656, "raw_average_key_size": 19, "raw_value_size": 1214752, "raw_average_value_size": 2504, "num_data_blocks": 83, "num_entries": 485, "num_filter_entries": 485, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763931996, "oldest_key_time": 1763931996, "file_creation_time": 1763932058, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 10535 microseconds, and 6389 cpu microseconds.
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:07:38.671654) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1227461 bytes OK
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:07:38.671765) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:07:38.673469) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:07:38.673493) EVENT_LOG_v1 {"time_micros": 1763932058673486, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:07:38.673520) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 1855649, prev total WAL file size 1855649, number of live WAL files 2.
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:07:38.675164) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1198KB)], [48(13MB)]
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932058675214, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 15195392, "oldest_snapshot_seqno": -1}
Nov 23 16:07:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:38.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5388 keys, 13042532 bytes, temperature: kUnknown
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932058750473, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 13042532, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13006889, "index_size": 21069, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13509, "raw_key_size": 138482, "raw_average_key_size": 25, "raw_value_size": 12909728, "raw_average_value_size": 2396, "num_data_blocks": 855, "num_entries": 5388, "num_filter_entries": 5388, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763932058, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:07:38.750682) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 13042532 bytes
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:07:38.752139) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 201.7 rd, 173.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 13.3 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(23.0) write-amplify(10.6) OK, records in: 5908, records dropped: 520 output_compression: NoCompression
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:07:38.752172) EVENT_LOG_v1 {"time_micros": 1763932058752158, "job": 28, "event": "compaction_finished", "compaction_time_micros": 75322, "compaction_time_cpu_micros": 29011, "output_level": 6, "num_output_files": 1, "total_output_size": 13042532, "num_input_records": 5908, "num_output_records": 5388, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932058752709, "job": 28, "event": "table_file_deletion", "file_number": 50}
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932058757026, "job": 28, "event": "table_file_deletion", "file_number": 48}
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:07:38.675059) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:07:38.757139) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:07:38.757150) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:07:38.757154) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:07:38.757158) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:07:38 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:07:38.757168) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:07:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:07:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:39 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:39.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:40 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:40 np0005532763 nova_compute[231311]: 2025-11-23 21:07:40.406 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:40 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:07:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:40.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:07:40 np0005532763 nova_compute[231311]: 2025-11-23 21:07:40.704 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:41 np0005532763 podman[235516]: 2025-11-23 21:07:41.211592145 +0000 UTC m=+0.080043589 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 23 16:07:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:41 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:41 np0005532763 ovn_controller[133425]: 2025-11-23T21:07:41Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6c:43:c1 10.100.0.29
Nov 23 16:07:41 np0005532763 ovn_controller[133425]: 2025-11-23T21:07:41Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6c:43:c1 10.100.0.29
Nov 23 16:07:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:07:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:41.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:07:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:42 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:42 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:07:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:42.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:07:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:43 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:43.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:44 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5800048c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:44 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5800048c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:44.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:07:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:45 np0005532763 nova_compute[231311]: 2025-11-23 21:07:45.462 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:45 np0005532763 nova_compute[231311]: 2025-11-23 21:07:45.706 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:45 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5800048c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:46.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:46 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:46 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:07:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:46.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:07:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:47 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:48.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:48 np0005532763 podman[235572]: 2025-11-23 21:07:48.251243169 +0000 UTC m=+0.132023185 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 16:07:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:48 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:48 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc5740043b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:48.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.344 231315 DEBUG oslo_concurrency.lockutils [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.344 231315 DEBUG oslo_concurrency.lockutils [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.345 231315 DEBUG oslo_concurrency.lockutils [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.345 231315 DEBUG oslo_concurrency.lockutils [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.346 231315 DEBUG oslo_concurrency.lockutils [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.348 231315 INFO nova.compute.manager [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Terminating instance#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.350 231315 DEBUG nova.compute.manager [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 23 16:07:49 np0005532763 kernel: tapfa39fedb-03 (unregistering): left promiscuous mode
Nov 23 16:07:49 np0005532763 NetworkManager[48849]: <info>  [1763932069.4162] device (tapfa39fedb-03): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 23 16:07:49 np0005532763 ovn_controller[133425]: 2025-11-23T21:07:49Z|00033|binding|INFO|Releasing lport fa39fedb-0393-4e6b-a380-50741abeeb9d from this chassis (sb_readonly=0)
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.429 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:49 np0005532763 ovn_controller[133425]: 2025-11-23T21:07:49Z|00034|binding|INFO|Setting lport fa39fedb-0393-4e6b-a380-50741abeeb9d down in Southbound
Nov 23 16:07:49 np0005532763 ovn_controller[133425]: 2025-11-23T21:07:49Z|00035|binding|INFO|Removing iface tapfa39fedb-03 ovn-installed in OVS
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.432 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:49 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:49.439 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:43:c1 10.100.0.29'], port_security=['fa:16:3e:6c:43:c1 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': 'cc03c89f-bbbe-477a-ad7c-2f31c9125d20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4330ece9-d3f3-4995-b96e-abf6bfccd4cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782593db60784ab8bff41fe87d72ff5f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '69d27e0d-1270-462f-9117-bfea049ad9d0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7e0d310c-ba36-446f-b044-c8e4856a70c5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], logical_port=fa39fedb-0393-4e6b-a380-50741abeeb9d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:07:49 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:49.442 142920 INFO neutron.agent.ovn.metadata.agent [-] Port fa39fedb-0393-4e6b-a380-50741abeeb9d in datapath 4330ece9-d3f3-4995-b96e-abf6bfccd4cc unbound from our chassis#033[00m
Nov 23 16:07:49 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:49.444 142920 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4330ece9-d3f3-4995-b96e-abf6bfccd4cc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 23 16:07:49 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:49.446 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[86e60bf6-0916-4157-a839-8da38e71bda9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:49 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:49.447 142920 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc namespace which is not needed anymore#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.470 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:49 np0005532763 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Nov 23 16:07:49 np0005532763 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 14.030s CPU time.
Nov 23 16:07:49 np0005532763 systemd-machined[194484]: Machine qemu-1-instance-00000002 terminated.
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.599 231315 INFO nova.virt.libvirt.driver [-] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Instance destroyed successfully.#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.600 231315 DEBUG nova.objects.instance [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lazy-loading 'resources' on Instance uuid cc03c89f-bbbe-477a-ad7c-2f31c9125d20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.616 231315 DEBUG nova.virt.libvirt.vif [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-23T21:07:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-749436504',display_name='tempest-TestNetworkBasicOps-server-749436504',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-749436504',id=2,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKi8custUVsiaaq6kl1F/UzCBdG1D+LU2LLwnWBAVFFHysn/MJX0KH/er5rYGFW/a70JpuJYuPjgSDIy48bKBG98pVDfg3bXEvJWC00N0L2Ff6HHmtCb6nNrW876ZSacXA==',key_name='tempest-TestNetworkBasicOps-1109475132',keypairs=<?>,launch_index=0,launched_at=2025-11-23T21:07:28Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-10j1rhls',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-23T21:07:28Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=cc03c89f-bbbe-477a-ad7c-2f31c9125d20,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "address": "fa:16:3e:6c:43:c1", "network": {"id": "4330ece9-d3f3-4995-b96e-abf6bfccd4cc", "bridge": "br-int", "label": "tempest-network-smoke--390742040", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa39fedb-03", "ovs_interfaceid": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.617 231315 DEBUG nova.network.os_vif_util [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "address": "fa:16:3e:6c:43:c1", "network": {"id": "4330ece9-d3f3-4995-b96e-abf6bfccd4cc", "bridge": "br-int", "label": "tempest-network-smoke--390742040", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa39fedb-03", "ovs_interfaceid": "fa39fedb-0393-4e6b-a380-50741abeeb9d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.619 231315 DEBUG nova.network.os_vif_util [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:43:c1,bridge_name='br-int',has_traffic_filtering=True,id=fa39fedb-0393-4e6b-a380-50741abeeb9d,network=Network(4330ece9-d3f3-4995-b96e-abf6bfccd4cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa39fedb-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.620 231315 DEBUG os_vif [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:43:c1,bridge_name='br-int',has_traffic_filtering=True,id=fa39fedb-0393-4e6b-a380-50741abeeb9d,network=Network(4330ece9-d3f3-4995-b96e-abf6bfccd4cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa39fedb-03') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.624 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.624 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa39fedb-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.629 231315 DEBUG nova.compute.manager [req-b27f521b-17a4-47a9-9d2a-18ccbb758997 req-181d60e9-7756-40d6-acae-2226cae75c12 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Received event network-vif-unplugged-fa39fedb-0393-4e6b-a380-50741abeeb9d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.629 231315 DEBUG oslo_concurrency.lockutils [req-b27f521b-17a4-47a9-9d2a-18ccbb758997 req-181d60e9-7756-40d6-acae-2226cae75c12 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.630 231315 DEBUG oslo_concurrency.lockutils [req-b27f521b-17a4-47a9-9d2a-18ccbb758997 req-181d60e9-7756-40d6-acae-2226cae75c12 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.630 231315 DEBUG oslo_concurrency.lockutils [req-b27f521b-17a4-47a9-9d2a-18ccbb758997 req-181d60e9-7756-40d6-acae-2226cae75c12 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.630 231315 DEBUG nova.compute.manager [req-b27f521b-17a4-47a9-9d2a-18ccbb758997 req-181d60e9-7756-40d6-acae-2226cae75c12 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] No waiting events found dispatching network-vif-unplugged-fa39fedb-0393-4e6b-a380-50741abeeb9d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.631 231315 DEBUG nova.compute.manager [req-b27f521b-17a4-47a9-9d2a-18ccbb758997 req-181d60e9-7756-40d6-acae-2226cae75c12 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Received event network-vif-unplugged-fa39fedb-0393-4e6b-a380-50741abeeb9d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.631 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.633 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.636 231315 INFO os_vif [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:43:c1,bridge_name='br-int',has_traffic_filtering=True,id=fa39fedb-0393-4e6b-a380-50741abeeb9d,network=Network(4330ece9-d3f3-4995-b96e-abf6bfccd4cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa39fedb-03')#033[00m
Nov 23 16:07:49 np0005532763 neutron-haproxy-ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc[235488]: [NOTICE]   (235492) : haproxy version is 2.8.14-c23fe91
Nov 23 16:07:49 np0005532763 neutron-haproxy-ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc[235488]: [NOTICE]   (235492) : path to executable is /usr/sbin/haproxy
Nov 23 16:07:49 np0005532763 neutron-haproxy-ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc[235488]: [WARNING]  (235492) : Exiting Master process...
Nov 23 16:07:49 np0005532763 neutron-haproxy-ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc[235488]: [ALERT]    (235492) : Current worker (235494) exited with code 143 (Terminated)
Nov 23 16:07:49 np0005532763 neutron-haproxy-ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc[235488]: [WARNING]  (235492) : All workers exited. Exiting... (0)
Nov 23 16:07:49 np0005532763 systemd[1]: libpod-6b1cb2e9a7625636d2f2b1f9b5e07eea934f09089940a575da45dfdf7b1f3886.scope: Deactivated successfully.
Nov 23 16:07:49 np0005532763 podman[235626]: 2025-11-23 21:07:49.674166242 +0000 UTC m=+0.076627043 container died 6b1cb2e9a7625636d2f2b1f9b5e07eea934f09089940a575da45dfdf7b1f3886 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 16:07:49 np0005532763 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b1cb2e9a7625636d2f2b1f9b5e07eea934f09089940a575da45dfdf7b1f3886-userdata-shm.mount: Deactivated successfully.
Nov 23 16:07:49 np0005532763 systemd[1]: var-lib-containers-storage-overlay-9a01ccee854440c1b5756d72900a02f9be5e9b26a3e6631976d782e2a024b91c-merged.mount: Deactivated successfully.
Nov 23 16:07:49 np0005532763 podman[235626]: 2025-11-23 21:07:49.731787536 +0000 UTC m=+0.134248307 container cleanup 6b1cb2e9a7625636d2f2b1f9b5e07eea934f09089940a575da45dfdf7b1f3886 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:07:49 np0005532763 systemd[1]: libpod-conmon-6b1cb2e9a7625636d2f2b1f9b5e07eea934f09089940a575da45dfdf7b1f3886.scope: Deactivated successfully.
Nov 23 16:07:49 np0005532763 podman[235684]: 2025-11-23 21:07:49.83503795 +0000 UTC m=+0.067986313 container remove 6b1cb2e9a7625636d2f2b1f9b5e07eea934f09089940a575da45dfdf7b1f3886 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 16:07:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:07:49 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:49.844 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[d15e1a7b-f11f-4236-9cd2-e8c1a21e4a19]: (4, ('Sun Nov 23 09:07:49 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc (6b1cb2e9a7625636d2f2b1f9b5e07eea934f09089940a575da45dfdf7b1f3886)\n6b1cb2e9a7625636d2f2b1f9b5e07eea934f09089940a575da45dfdf7b1f3886\nSun Nov 23 09:07:49 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc (6b1cb2e9a7625636d2f2b1f9b5e07eea934f09089940a575da45dfdf7b1f3886)\n6b1cb2e9a7625636d2f2b1f9b5e07eea934f09089940a575da45dfdf7b1f3886\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:49 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:49.847 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[43a9b1fe-f857-441d-8f30-6c21b90cc4ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:49 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:49.848 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4330ece9-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.851 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:49 np0005532763 kernel: tap4330ece9-d0: left promiscuous mode
Nov 23 16:07:49 np0005532763 nova_compute[231311]: 2025-11-23 21:07:49.879 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:49 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:49.883 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[71b376ac-9256-4b1d-ad04-c0f91888e9ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:49 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:49.900 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[406847ab-2fed-4472-8c27-980103ad2a99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:49 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:49.902 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[9152ba3f-693f-42dd-b12f-17d4c59f0d3a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:49 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:49.927 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[923e6563-6530-4ca0-b778-fe5c29ec6ee0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 400078, 'reachable_time': 35186, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235700, 'error': None, 'target': 'ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:49 np0005532763 systemd[1]: run-netns-ovnmeta\x2d4330ece9\x2dd3f3\x2d4995\x2db96e\x2dabf6bfccd4cc.mount: Deactivated successfully.
Nov 23 16:07:49 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:49.944 143034 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4330ece9-d3f3-4995-b96e-abf6bfccd4cc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 23 16:07:49 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:49.946 143034 DEBUG oslo.privsep.daemon [-] privsep: reply[640ea645-6946-4515-82b3-3e3346b7b218]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:07:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:49 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:50.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:50 np0005532763 nova_compute[231311]: 2025-11-23 21:07:50.136 231315 INFO nova.virt.libvirt.driver [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Deleting instance files /var/lib/nova/instances/cc03c89f-bbbe-477a-ad7c-2f31c9125d20_del#033[00m
Nov 23 16:07:50 np0005532763 nova_compute[231311]: 2025-11-23 21:07:50.137 231315 INFO nova.virt.libvirt.driver [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Deletion of /var/lib/nova/instances/cc03c89f-bbbe-477a-ad7c-2f31c9125d20_del complete#033[00m
Nov 23 16:07:50 np0005532763 nova_compute[231311]: 2025-11-23 21:07:50.236 231315 DEBUG nova.virt.libvirt.host [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Nov 23 16:07:50 np0005532763 nova_compute[231311]: 2025-11-23 21:07:50.237 231315 INFO nova.virt.libvirt.host [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] UEFI support detected#033[00m
Nov 23 16:07:50 np0005532763 nova_compute[231311]: 2025-11-23 21:07:50.239 231315 INFO nova.compute.manager [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Took 0.89 seconds to destroy the instance on the hypervisor.#033[00m
Nov 23 16:07:50 np0005532763 nova_compute[231311]: 2025-11-23 21:07:50.240 231315 DEBUG oslo.service.loopingcall [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 23 16:07:50 np0005532763 nova_compute[231311]: 2025-11-23 21:07:50.240 231315 DEBUG nova.compute.manager [-] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 23 16:07:50 np0005532763 nova_compute[231311]: 2025-11-23 21:07:50.241 231315 DEBUG nova.network.neutron [-] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 23 16:07:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:50 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:50 np0005532763 nova_compute[231311]: 2025-11-23 21:07:50.517 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:07:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:50 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:07:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:50.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:07:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.164 231315 DEBUG nova.network.neutron [-] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.187 231315 INFO nova.compute.manager [-] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Took 0.95 seconds to deallocate network for instance.#033[00m
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.231 231315 DEBUG oslo_concurrency.lockutils [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.231 231315 DEBUG oslo_concurrency.lockutils [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.286 231315 DEBUG nova.compute.manager [req-aecc4127-ff3d-40a0-9916-5a860a17b6b6 req-70bf153e-2cbd-4c79-956b-ecfe33a5aa91 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Received event network-vif-deleted-fa39fedb-0393-4e6b-a380-50741abeeb9d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.290 231315 DEBUG oslo_concurrency.processutils [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.707 231315 DEBUG nova.compute.manager [req-3baf83c1-5d8f-47e4-893d-7842587525dc req-a21f083b-796f-4c5c-b1e7-d571fa40451b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Received event network-vif-plugged-fa39fedb-0393-4e6b-a380-50741abeeb9d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.708 231315 DEBUG oslo_concurrency.lockutils [req-3baf83c1-5d8f-47e4-893d-7842587525dc req-a21f083b-796f-4c5c-b1e7-d571fa40451b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.708 231315 DEBUG oslo_concurrency.lockutils [req-3baf83c1-5d8f-47e4-893d-7842587525dc req-a21f083b-796f-4c5c-b1e7-d571fa40451b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.709 231315 DEBUG oslo_concurrency.lockutils [req-3baf83c1-5d8f-47e4-893d-7842587525dc req-a21f083b-796f-4c5c-b1e7-d571fa40451b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.709 231315 DEBUG nova.compute.manager [req-3baf83c1-5d8f-47e4-893d-7842587525dc req-a21f083b-796f-4c5c-b1e7-d571fa40451b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] No waiting events found dispatching network-vif-plugged-fa39fedb-0393-4e6b-a380-50741abeeb9d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.709 231315 WARNING nova.compute.manager [req-3baf83c1-5d8f-47e4-893d-7842587525dc req-a21f083b-796f-4c5c-b1e7-d571fa40451b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Received unexpected event network-vif-plugged-fa39fedb-0393-4e6b-a380-50741abeeb9d for instance with vm_state deleted and task_state None.#033[00m
Nov 23 16:07:51 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:07:51 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2132313810' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.767 231315 DEBUG oslo_concurrency.processutils [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.775 231315 DEBUG nova.compute.provider_tree [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Updating inventory in ProviderTree for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.823 231315 ERROR nova.scheduler.client.report [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [req-29b7d2e4-c456-4c02-908f-2de6b65ee229] Failed to update inventory to [{'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 20c32e0a-de2c-427c-9273-fac11e2660f4.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-29b7d2e4-c456-4c02-908f-2de6b65ee229"}]}
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.843 231315 DEBUG nova.scheduler.client.report [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Refreshing inventories for resource provider 20c32e0a-de2c-427c-9273-fac11e2660f4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.859 231315 DEBUG nova.scheduler.client.report [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Updating ProviderTree inventory for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.859 231315 DEBUG nova.compute.provider_tree [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Updating inventory in ProviderTree for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.882 231315 DEBUG nova.scheduler.client.report [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Refreshing aggregate associations for resource provider 20c32e0a-de2c-427c-9273-fac11e2660f4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 23 16:07:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.911 231315 DEBUG nova.scheduler.client.report [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Refreshing trait associations for resource provider 20c32e0a-de2c-427c-9273-fac11e2660f4, traits: COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AVX2,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_FMA3,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE,HW_CPU_X86_SVM,HW_CPU_X86_BMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 23 16:07:51 np0005532763 nova_compute[231311]: 2025-11-23 21:07:51.955 231315 DEBUG oslo_concurrency.processutils [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 16:07:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:51 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:52.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:52 np0005532763 podman[235730]: 2025-11-23 21:07:52.21386748 +0000 UTC m=+0.087791194 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 16:07:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:52.220 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 16:07:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:52.221 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 16:07:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:07:52.221 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 16:07:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:52 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:52 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:07:52 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1725535026' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:07:52 np0005532763 nova_compute[231311]: 2025-11-23 21:07:52.443 231315 DEBUG oslo_concurrency.processutils [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 16:07:52 np0005532763 nova_compute[231311]: 2025-11-23 21:07:52.452 231315 DEBUG nova.compute.provider_tree [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Updating inventory in ProviderTree for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 16:07:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:52 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:52 np0005532763 nova_compute[231311]: 2025-11-23 21:07:52.597 231315 DEBUG nova.scheduler.client.report [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Updated inventory for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 23 16:07:52 np0005532763 nova_compute[231311]: 2025-11-23 21:07:52.598 231315 DEBUG nova.compute.provider_tree [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Updating resource provider 20c32e0a-de2c-427c-9273-fac11e2660f4 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 23 16:07:52 np0005532763 nova_compute[231311]: 2025-11-23 21:07:52.598 231315 DEBUG nova.compute.provider_tree [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Updating inventory in ProviderTree for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 16:07:52 np0005532763 nova_compute[231311]: 2025-11-23 21:07:52.627 231315 DEBUG oslo_concurrency.lockutils [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.396s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 16:07:52 np0005532763 nova_compute[231311]: 2025-11-23 21:07:52.657 231315 INFO nova.scheduler.client.report [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Deleted allocations for instance cc03c89f-bbbe-477a-ad7c-2f31c9125d20
Nov 23 16:07:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:52.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:52 np0005532763 nova_compute[231311]: 2025-11-23 21:07:52.759 231315 DEBUG oslo_concurrency.lockutils [None req-e4017730-72e2-491d-acdf-8ad33e27a2c0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "cc03c89f-bbbe-477a-ad7c-2f31c9125d20" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.415s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 16:07:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:53 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc558003630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:54.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:54 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc55c001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:54 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Nov 23 16:07:54 np0005532763 nova_compute[231311]: 2025-11-23 21:07:54.627 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:07:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:54.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:07:54 np0005532763 nova_compute[231311]: 2025-11-23 21:07:54.883 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:07:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:55 np0005532763 nova_compute[231311]: 2025-11-23 21:07:55.017 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:07:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:55 np0005532763 nova_compute[231311]: 2025-11-23 21:07:55.562 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:07:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:55 np0005532763 kernel: ganesha.nfsd[234598]: segfault at 50 ip 00007fc631c6432e sp 00007fc600ff8210 error 4 in libntirpc.so.5.8[7fc631c49000+2c000] likely on CPU 2 (core 0, socket 2)
Nov 23 16:07:55 np0005532763 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Nov 23 16:07:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[233105]: 23/11/2025 21:07:55 : epoch 692376df : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fc564001090 fd 39 proxy ignored for local
Nov 23 16:07:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:07:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:56.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:07:56 np0005532763 systemd[1]: Started Process Core Dump (PID 235774/UID 0).
Nov 23 16:07:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:56.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:57 np0005532763 systemd-coredump[235775]: Process 233109 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 63:#012#0  0x00007fc631c6432e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Nov 23 16:07:57 np0005532763 systemd[1]: systemd-coredump@14-235774-0.service: Deactivated successfully.
Nov 23 16:07:57 np0005532763 systemd[1]: systemd-coredump@14-235774-0.service: Consumed 1.240s CPU time.
Nov 23 16:07:57 np0005532763 podman[235781]: 2025-11-23 21:07:57.346946808 +0000 UTC m=+0.026964501 container died 7921b599458ffbb72715ff70d93a9d72e81e19b2899dc7ee74263bca36278a23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Nov 23 16:07:57 np0005532763 systemd[1]: var-lib-containers-storage-overlay-07061cdc3c556e8090fbe0c3cfc396558d200bcbc400be2482075806b3283578-merged.mount: Deactivated successfully.
Nov 23 16:07:57 np0005532763 podman[235781]: 2025-11-23 21:07:57.409209531 +0000 UTC m=+0.089227184 container remove 7921b599458ffbb72715ff70d93a9d72e81e19b2899dc7ee74263bca36278a23 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Nov 23 16:07:57 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Main process exited, code=exited, status=139/n/a
Nov 23 16:07:57 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Failed with result 'exit-code'.
Nov 23 16:07:57 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 2.088s CPU time.
Nov 23 16:07:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:07:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:07:58.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:07:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:58 np0005532763 nova_compute[231311]: 2025-11-23 21:07:58.380 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 16:07:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:07:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:07:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:07:58.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:07:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:07:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:07:59 np0005532763 nova_compute[231311]: 2025-11-23 21:07:59.629 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:07:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:07:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:07:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:00.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:00 np0005532763 nova_compute[231311]: 2025-11-23 21:08:00.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 16:08:00 np0005532763 nova_compute[231311]: 2025-11-23 21:08:00.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 16:08:00 np0005532763 nova_compute[231311]: 2025-11-23 21:08:00.605 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:08:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:00.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:01 np0005532763 nova_compute[231311]: 2025-11-23 21:08:01.379 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 16:08:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210801 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 16:08:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:08:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:02.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:08:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:02 np0005532763 nova_compute[231311]: 2025-11-23 21:08:02.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 16:08:02 np0005532763 nova_compute[231311]: 2025-11-23 21:08:02.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 16:08:02 np0005532763 nova_compute[231311]: 2025-11-23 21:08:02.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 16:08:02 np0005532763 nova_compute[231311]: 2025-11-23 21:08:02.396 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 23 16:08:02 np0005532763 nova_compute[231311]: 2025-11-23 21:08:02.396 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 16:08:02 np0005532763 nova_compute[231311]: 2025-11-23 21:08:02.397 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 16:08:02 np0005532763 nova_compute[231311]: 2025-11-23 21:08:02.420 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 16:08:02 np0005532763 nova_compute[231311]: 2025-11-23 21:08:02.421 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 16:08:02 np0005532763 nova_compute[231311]: 2025-11-23 21:08:02.421 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 16:08:02 np0005532763 nova_compute[231311]: 2025-11-23 21:08:02.421 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:08:02 np0005532763 nova_compute[231311]: 2025-11-23 21:08:02.422 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:08:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:02.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:02 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:08:02 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/444257315' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:08:02 np0005532763 nova_compute[231311]: 2025-11-23 21:08:02.900 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:08:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:03 np0005532763 nova_compute[231311]: 2025-11-23 21:08:03.174 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:08:03 np0005532763 nova_compute[231311]: 2025-11-23 21:08:03.177 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4982MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:08:03 np0005532763 nova_compute[231311]: 2025-11-23 21:08:03.178 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:08:03 np0005532763 nova_compute[231311]: 2025-11-23 21:08:03.178 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:08:03 np0005532763 nova_compute[231311]: 2025-11-23 21:08:03.250 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:08:03 np0005532763 nova_compute[231311]: 2025-11-23 21:08:03.250 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:08:03 np0005532763 nova_compute[231311]: 2025-11-23 21:08:03.268 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:08:03 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:08:03 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2500973786' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:08:03 np0005532763 nova_compute[231311]: 2025-11-23 21:08:03.750 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:08:03 np0005532763 nova_compute[231311]: 2025-11-23 21:08:03.757 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:08:03 np0005532763 nova_compute[231311]: 2025-11-23 21:08:03.778 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:08:03 np0005532763 nova_compute[231311]: 2025-11-23 21:08:03.814 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:08:03 np0005532763 nova_compute[231311]: 2025-11-23 21:08:03.814 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.636s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:08:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:04.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:04 np0005532763 nova_compute[231311]: 2025-11-23 21:08:04.598 231315 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763932069.5970893, cc03c89f-bbbe-477a-ad7c-2f31c9125d20 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:08:04 np0005532763 nova_compute[231311]: 2025-11-23 21:08:04.598 231315 INFO nova.compute.manager [-] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] VM Stopped (Lifecycle Event)#033[00m
Nov 23 16:08:04 np0005532763 nova_compute[231311]: 2025-11-23 21:08:04.615 231315 DEBUG nova.compute.manager [None req-73424a64-4289-4f20-aee4-645e5fd1e4a6 - - - - - -] [instance: cc03c89f-bbbe-477a-ad7c-2f31c9125d20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:08:04 np0005532763 nova_compute[231311]: 2025-11-23 21:08:04.632 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:04.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:04 np0005532763 nova_compute[231311]: 2025-11-23 21:08:04.801 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:08:04 np0005532763 nova_compute[231311]: 2025-11-23 21:08:04.802 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:08:04 np0005532763 nova_compute[231311]: 2025-11-23 21:08:04.802 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:08:04 np0005532763 nova_compute[231311]: 2025-11-23 21:08:04.803 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:08:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:08:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:05 np0005532763 nova_compute[231311]: 2025-11-23 21:08:05.628 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:06.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/210806 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 16:08:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:06.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:07 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Scheduled restart job, restart counter is at 15.
Nov 23 16:08:07 np0005532763 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 16:08:07 np0005532763 systemd[1]: ceph-03808be8-ae4a-5548-82e6-4a294f1bc627@nfs.cephfs.1.0.compute-2.dqbktw.service: Consumed 2.088s CPU time.
Nov 23 16:08:07 np0005532763 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627...
Nov 23 16:08:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:08:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3522098395' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:08:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:08:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3522098395' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:08:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:08.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:08 np0005532763 podman[235952]: 2025-11-23 21:08:08.259430215 +0000 UTC m=+0.068486767 container create 10ce05665482e9899a7eee0ab4547bdd9a9d872d3217d9554617c432a64e912a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Nov 23 16:08:08 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2960b32c8a3aaf35547799176da473193525dc38c0b8b16bad49d7bb3271141/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Nov 23 16:08:08 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2960b32c8a3aaf35547799176da473193525dc38c0b8b16bad49d7bb3271141/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 16:08:08 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2960b32c8a3aaf35547799176da473193525dc38c0b8b16bad49d7bb3271141/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 16:08:08 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f2960b32c8a3aaf35547799176da473193525dc38c0b8b16bad49d7bb3271141/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.dqbktw-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Nov 23 16:08:08 np0005532763 podman[235952]: 2025-11-23 21:08:08.231024754 +0000 UTC m=+0.040081366 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 16:08:08 np0005532763 podman[235952]: 2025-11-23 21:08:08.337533719 +0000 UTC m=+0.146590321 container init 10ce05665482e9899a7eee0ab4547bdd9a9d872d3217d9554617c432a64e912a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Nov 23 16:08:08 np0005532763 podman[235952]: 2025-11-23 21:08:08.347005732 +0000 UTC m=+0.156062274 container start 10ce05665482e9899a7eee0ab4547bdd9a9d872d3217d9554617c432a64e912a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 16:08:08 np0005532763 bash[235952]: 10ce05665482e9899a7eee0ab4547bdd9a9d872d3217d9554617c432a64e912a
Nov 23 16:08:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Nov 23 16:08:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Nov 23 16:08:08 np0005532763 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.dqbktw for 03808be8-ae4a-5548-82e6-4a294f1bc627.
Nov 23 16:08:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Nov 23 16:08:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Nov 23 16:08:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Nov 23 16:08:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Nov 23 16:08:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Nov 23 16:08:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:08:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:08.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:09 np0005532763 nova_compute[231311]: 2025-11-23 21:08:09.635 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:08:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:10.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:10 np0005532763 nova_compute[231311]: 2025-11-23 21:08:10.674 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:08:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:10.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:08:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:11 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:08:11.350 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3a:26:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:d5:4d:db:d5:2b'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:08:11 np0005532763 nova_compute[231311]: 2025-11-23 21:08:11.352 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:11 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:08:11.351 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 23 16:08:11 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:08:11.353 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10e3bf57-dd2d-4b94-851f-925bcd297dde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:08:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:12.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:12 np0005532763 podman[236013]: 2025-11-23 21:08:12.215818131 +0000 UTC m=+0.087589229 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 16:08:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:12.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:08:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:14.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:08:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:14 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:08:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:14 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:08:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:14 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:08:14 np0005532763 nova_compute[231311]: 2025-11-23 21:08:14.638 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:14.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:08:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:15 np0005532763 nova_compute[231311]: 2025-11-23 21:08:15.712 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:16.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:16.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:18.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:18.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:18 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:08:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:18 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:08:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:18 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:08:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:19 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:08:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:19 np0005532763 podman[236040]: 2025-11-23 21:08:19.234314975 +0000 UTC m=+0.113714006 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 23 16:08:19 np0005532763 nova_compute[231311]: 2025-11-23 21:08:19.641 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:08:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:20.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:20 np0005532763 nova_compute[231311]: 2025-11-23 21:08:20.747 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:08:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:20.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:08:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:08:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:22.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:08:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:22.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:23 np0005532763 podman[236071]: 2025-11-23 21:08:23.216893322 +0000 UTC m=+0.089849282 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Nov 23 16:08:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:23 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:08:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:23 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:08:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:23 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:08:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:24 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:08:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:24.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:24 np0005532763 nova_compute[231311]: 2025-11-23 21:08:24.643 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:08:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:24.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:08:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:08:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:25 np0005532763 nova_compute[231311]: 2025-11-23 21:08:25.784 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:26.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:26 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:08:26 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:08:26 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:08:26 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:08:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:08:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:26.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:08:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:08:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:28.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:08:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:08:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:28.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:08:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:28 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:08:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:28 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:08:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:28 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:08:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:29 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:08:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:29 np0005532763 nova_compute[231311]: 2025-11-23 21:08:29.646 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:08:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:30.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:30 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:08:30 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:08:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:08:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:30.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:08:30 np0005532763 nova_compute[231311]: 2025-11-23 21:08:30.838 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:32.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:08:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:32.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:08:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:33 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:08:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:33 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:08:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:33 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:08:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:34 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:08:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:34.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:34 np0005532763 nova_compute[231311]: 2025-11-23 21:08:34.649 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:34.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:08:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:35 np0005532763 nova_compute[231311]: 2025-11-23 21:08:35.841 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:36.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:36.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:38.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:38.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:38 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:08:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:38 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:08:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:38 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:08:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:39 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:08:39 np0005532763 ovn_controller[133425]: 2025-11-23T21:08:39Z|00036|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Nov 23 16:08:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:39 np0005532763 nova_compute[231311]: 2025-11-23 21:08:39.652 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:08:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:08:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:40.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:08:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:08:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:40.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:08:40 np0005532763 nova_compute[231311]: 2025-11-23 21:08:40.874 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:42.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:42.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:43 np0005532763 podman[236243]: 2025-11-23 21:08:43.232724081 +0000 UTC m=+0.097614777 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 16:08:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:43 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:08:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:43 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:08:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:43 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:08:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:44 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:08:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:44.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:44 np0005532763 nova_compute[231311]: 2025-11-23 21:08:44.655 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:44.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:08:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:45 np0005532763 nova_compute[231311]: 2025-11-23 21:08:45.875 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:46.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:46.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:08:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:48.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:08:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:48.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:48 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:08:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:49 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:08:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:49 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:08:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:49 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:08:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:49 np0005532763 nova_compute[231311]: 2025-11-23 21:08:49.658 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:08:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:08:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:50.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:08:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:50 np0005532763 podman[236295]: 2025-11-23 21:08:50.281728687 +0000 UTC m=+0.160619912 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 23 16:08:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:08:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:50.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:08:50 np0005532763 nova_compute[231311]: 2025-11-23 21:08:50.877 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:52.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:08:52.222 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:08:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:08:52.223 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:08:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:08:52.223 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:08:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:52.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:53 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:08:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:54 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:08:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:54 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:08:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:54 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:08:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:08:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:54.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:08:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:54 np0005532763 podman[236326]: 2025-11-23 21:08:54.184600115 +0000 UTC m=+0.065120873 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:08:54 np0005532763 nova_compute[231311]: 2025-11-23 21:08:54.660 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:54.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:08:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:55 np0005532763 nova_compute[231311]: 2025-11-23 21:08:55.879 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:08:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:56.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:08:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:56.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:08:58.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:08:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:08:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:08:58.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:08:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:58 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:08:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:58 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:08:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:58 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:08:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:08:59 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:08:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:08:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:08:59 np0005532763 nova_compute[231311]: 2025-11-23 21:08:59.701 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:08:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:08:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:08:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:00.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:00.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:00 np0005532763 nova_compute[231311]: 2025-11-23 21:09:00.882 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:01 np0005532763 nova_compute[231311]: 2025-11-23 21:09:01.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:09:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:09:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:02.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:09:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:02 np0005532763 nova_compute[231311]: 2025-11-23 21:09:02.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:09:02 np0005532763 nova_compute[231311]: 2025-11-23 21:09:02.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:09:02 np0005532763 nova_compute[231311]: 2025-11-23 21:09:02.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:09:02 np0005532763 nova_compute[231311]: 2025-11-23 21:09:02.395 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:09:02 np0005532763 nova_compute[231311]: 2025-11-23 21:09:02.396 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:09:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:02.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:03 np0005532763 nova_compute[231311]: 2025-11-23 21:09:03.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:09:03 np0005532763 nova_compute[231311]: 2025-11-23 21:09:03.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:09:03 np0005532763 nova_compute[231311]: 2025-11-23 21:09:03.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:09:03 np0005532763 nova_compute[231311]: 2025-11-23 21:09:03.405 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:09:03 np0005532763 nova_compute[231311]: 2025-11-23 21:09:03.407 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:09:03 np0005532763 nova_compute[231311]: 2025-11-23 21:09:03.407 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:09:03 np0005532763 nova_compute[231311]: 2025-11-23 21:09:03.407 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:09:03 np0005532763 nova_compute[231311]: 2025-11-23 21:09:03.408 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:09:03 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:09:03 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3740722688' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:09:03 np0005532763 nova_compute[231311]: 2025-11-23 21:09:03.895 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:09:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:03 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:09:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:03 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:09:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:03 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:09:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:04 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:09:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:04.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:04 np0005532763 nova_compute[231311]: 2025-11-23 21:09:04.122 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:09:04 np0005532763 nova_compute[231311]: 2025-11-23 21:09:04.124 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4978MB free_disk=59.94853591918945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:09:04 np0005532763 nova_compute[231311]: 2025-11-23 21:09:04.124 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:09:04 np0005532763 nova_compute[231311]: 2025-11-23 21:09:04.124 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:09:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:04 np0005532763 nova_compute[231311]: 2025-11-23 21:09:04.192 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:09:04 np0005532763 nova_compute[231311]: 2025-11-23 21:09:04.192 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:09:04 np0005532763 nova_compute[231311]: 2025-11-23 21:09:04.210 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:09:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:09:04 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/249885637' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:09:04 np0005532763 nova_compute[231311]: 2025-11-23 21:09:04.669 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:09:04 np0005532763 nova_compute[231311]: 2025-11-23 21:09:04.678 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:09:04 np0005532763 nova_compute[231311]: 2025-11-23 21:09:04.694 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:09:04 np0005532763 nova_compute[231311]: 2025-11-23 21:09:04.697 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:09:04 np0005532763 nova_compute[231311]: 2025-11-23 21:09:04.697 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.573s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:09:04 np0005532763 nova_compute[231311]: 2025-11-23 21:09:04.703 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:04.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:09:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:05 np0005532763 nova_compute[231311]: 2025-11-23 21:09:05.699 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:09:05 np0005532763 nova_compute[231311]: 2025-11-23 21:09:05.700 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:09:05 np0005532763 nova_compute[231311]: 2025-11-23 21:09:05.700 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:09:05 np0005532763 nova_compute[231311]: 2025-11-23 21:09:05.700 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:09:05 np0005532763 nova_compute[231311]: 2025-11-23 21:09:05.885 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:06.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:06.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:09:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2540947135' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:09:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:09:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2540947135' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:09:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:08.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:08.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:09:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:09:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:09:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:09 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:09:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:09 np0005532763 nova_compute[231311]: 2025-11-23 21:09:09.776 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:09:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:09:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:10.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:09:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:09:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:10.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:09:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:10 np0005532763 nova_compute[231311]: 2025-11-23 21:09:10.921 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:11 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:09:11.836 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3a:26:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:d5:4d:db:d5:2b'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:09:11 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:09:11.837 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 23 16:09:11 np0005532763 nova_compute[231311]: 2025-11-23 21:09:11.838 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:09:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:12.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:09:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:12.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:13 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:09:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:13 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:09:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:13 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:09:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:14 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:09:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:14.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:14 np0005532763 podman[236435]: 2025-11-23 21:09:14.200480176 +0000 UTC m=+0.083053123 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 16:09:14 np0005532763 nova_compute[231311]: 2025-11-23 21:09:14.779 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:09:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:14.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:09:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:09:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:15 np0005532763 nova_compute[231311]: 2025-11-23 21:09:15.962 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:16.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:09:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:16.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:09:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:18.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:09:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:18.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:09:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:18 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:09:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:18 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:09:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:18 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:09:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:19 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:09:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:19 np0005532763 nova_compute[231311]: 2025-11-23 21:09:19.328 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "227fff00-2bf2-4d7a-9ee7-ff4eaddc0880" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:09:19 np0005532763 nova_compute[231311]: 2025-11-23 21:09:19.328 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "227fff00-2bf2-4d7a-9ee7-ff4eaddc0880" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:09:19 np0005532763 nova_compute[231311]: 2025-11-23 21:09:19.343 231315 DEBUG nova.compute.manager [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 23 16:09:19 np0005532763 nova_compute[231311]: 2025-11-23 21:09:19.411 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:09:19 np0005532763 nova_compute[231311]: 2025-11-23 21:09:19.411 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:09:19 np0005532763 nova_compute[231311]: 2025-11-23 21:09:19.419 231315 DEBUG nova.virt.hardware [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 23 16:09:19 np0005532763 nova_compute[231311]: 2025-11-23 21:09:19.419 231315 INFO nova.compute.claims [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 23 16:09:19 np0005532763 nova_compute[231311]: 2025-11-23 21:09:19.520 231315 DEBUG oslo_concurrency.processutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:09:19 np0005532763 nova_compute[231311]: 2025-11-23 21:09:19.782 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:09:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:09:19 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2485713192' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.016 231315 DEBUG oslo_concurrency.processutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.024 231315 DEBUG nova.compute.provider_tree [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.040 231315 DEBUG nova.scheduler.client.report [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.071 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.072 231315 DEBUG nova.compute.manager [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 23 16:09:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:09:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:20.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.126 231315 DEBUG nova.compute.manager [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.127 231315 DEBUG nova.network.neutron [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 23 16:09:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.157 231315 INFO nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.174 231315 DEBUG nova.compute.manager [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.269 231315 DEBUG nova.compute.manager [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.271 231315 DEBUG nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.272 231315 INFO nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Creating image(s)#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.311 231315 DEBUG nova.storage.rbd_utils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.348 231315 DEBUG nova.storage.rbd_utils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.385 231315 DEBUG nova.storage.rbd_utils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.390 231315 DEBUG oslo_concurrency.processutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.417 231315 DEBUG nova.policy [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9fb5352c62684f2ba3a326a953a10dfe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '782593db60784ab8bff41fe87d72ff5f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.461 231315 DEBUG oslo_concurrency.processutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.463 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.464 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.464 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.502 231315 DEBUG nova.storage.rbd_utils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.507 231315 DEBUG oslo_concurrency.processutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.832 231315 DEBUG oslo_concurrency.processutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.325s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:09:20 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:09:20.840 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10e3bf57-dd2d-4b94-851f-925bcd297dde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:09:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:09:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:20.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:09:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:20 np0005532763 nova_compute[231311]: 2025-11-23 21:09:20.935 231315 DEBUG nova.storage.rbd_utils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] resizing rbd image 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Nov 23 16:09:21 np0005532763 nova_compute[231311]: 2025-11-23 21:09:21.013 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:21 np0005532763 nova_compute[231311]: 2025-11-23 21:09:21.086 231315 DEBUG nova.objects.instance [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lazy-loading 'migration_context' on Instance uuid 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:09:21 np0005532763 nova_compute[231311]: 2025-11-23 21:09:21.096 231315 DEBUG nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 23 16:09:21 np0005532763 nova_compute[231311]: 2025-11-23 21:09:21.096 231315 DEBUG nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Ensure instance console log exists: /var/lib/nova/instances/227fff00-2bf2-4d7a-9ee7-ff4eaddc0880/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 23 16:09:21 np0005532763 nova_compute[231311]: 2025-11-23 21:09:21.097 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:09:21 np0005532763 nova_compute[231311]: 2025-11-23 21:09:21.097 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:09:21 np0005532763 nova_compute[231311]: 2025-11-23 21:09:21.098 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:09:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:21 np0005532763 podman[236650]: 2025-11-23 21:09:21.267734978 +0000 UTC m=+0.143885436 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 16:09:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:22.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:22 np0005532763 nova_compute[231311]: 2025-11-23 21:09:22.729 231315 DEBUG nova.network.neutron [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Successfully created port: 8a20c477-8372-4fe9-9b5f-ae2695da3fcd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 23 16:09:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:22.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:23 np0005532763 nova_compute[231311]: 2025-11-23 21:09:23.407 231315 DEBUG nova.network.neutron [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Successfully updated port: 8a20c477-8372-4fe9-9b5f-ae2695da3fcd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 23 16:09:23 np0005532763 nova_compute[231311]: 2025-11-23 21:09:23.427 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "refresh_cache-227fff00-2bf2-4d7a-9ee7-ff4eaddc0880" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:09:23 np0005532763 nova_compute[231311]: 2025-11-23 21:09:23.428 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquired lock "refresh_cache-227fff00-2bf2-4d7a-9ee7-ff4eaddc0880" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:09:23 np0005532763 nova_compute[231311]: 2025-11-23 21:09:23.428 231315 DEBUG nova.network.neutron [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 23 16:09:23 np0005532763 nova_compute[231311]: 2025-11-23 21:09:23.539 231315 DEBUG nova.compute.manager [req-20008053-7006-475c-bfef-914378ba67cd req-159e8e51-5ad1-41ef-a355-7691781dd9f9 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Received event network-changed-8a20c477-8372-4fe9-9b5f-ae2695da3fcd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:09:23 np0005532763 nova_compute[231311]: 2025-11-23 21:09:23.540 231315 DEBUG nova.compute.manager [req-20008053-7006-475c-bfef-914378ba67cd req-159e8e51-5ad1-41ef-a355-7691781dd9f9 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Refreshing instance network info cache due to event network-changed-8a20c477-8372-4fe9-9b5f-ae2695da3fcd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 23 16:09:23 np0005532763 nova_compute[231311]: 2025-11-23 21:09:23.540 231315 DEBUG oslo_concurrency.lockutils [req-20008053-7006-475c-bfef-914378ba67cd req-159e8e51-5ad1-41ef-a355-7691781dd9f9 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "refresh_cache-227fff00-2bf2-4d7a-9ee7-ff4eaddc0880" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:09:23 np0005532763 nova_compute[231311]: 2025-11-23 21:09:23.578 231315 DEBUG nova.network.neutron [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 23 16:09:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:24 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:09:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:24 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:09:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:24 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:09:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:24 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:09:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:09:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:24.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:09:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:24 np0005532763 podman[236704]: 2025-11-23 21:09:24.539697687 +0000 UTC m=+0.105035145 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2)
Nov 23 16:09:24 np0005532763 nova_compute[231311]: 2025-11-23 21:09:24.826 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:09:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:09:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:24.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:09:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.086 231315 DEBUG nova.network.neutron [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Updating instance_info_cache with network_info: [{"id": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "address": "fa:16:3e:54:70:bb", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a20c477-83", "ovs_interfaceid": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.104 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Releasing lock "refresh_cache-227fff00-2bf2-4d7a-9ee7-ff4eaddc0880" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.104 231315 DEBUG nova.compute.manager [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Instance network_info: |[{"id": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "address": "fa:16:3e:54:70:bb", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a20c477-83", "ovs_interfaceid": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.105 231315 DEBUG oslo_concurrency.lockutils [req-20008053-7006-475c-bfef-914378ba67cd req-159e8e51-5ad1-41ef-a355-7691781dd9f9 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquired lock "refresh_cache-227fff00-2bf2-4d7a-9ee7-ff4eaddc0880" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.105 231315 DEBUG nova.network.neutron [req-20008053-7006-475c-bfef-914378ba67cd req-159e8e51-5ad1-41ef-a355-7691781dd9f9 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Refreshing network info cache for port 8a20c477-8372-4fe9-9b5f-ae2695da3fcd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.110 231315 DEBUG nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Start _get_guest_xml network_info=[{"id": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "address": "fa:16:3e:54:70:bb", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a20c477-83", "ovs_interfaceid": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-23T21:05:43Z,direct_url=<?>,disk_format='qcow2',id=3c45fa6c-8a99-4359-a34e-d89f4e1e77d0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3f8fb5175f85402ba20cf9c6989d47cf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-23T21:05:47Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'encryption_options': None, 'size': 0, 'image_id': '3c45fa6c-8a99-4359-a34e-d89f4e1e77d0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.118 231315 WARNING nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.131 231315 DEBUG nova.virt.libvirt.host [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.132 231315 DEBUG nova.virt.libvirt.host [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 23 16:09:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.136 231315 DEBUG nova.virt.libvirt.host [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.137 231315 DEBUG nova.virt.libvirt.host [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.138 231315 DEBUG nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.138 231315 DEBUG nova.virt.hardware [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-23T21:05:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='56044b93-2979-48aa-b67f-c37e1b489306',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-23T21:05:43Z,direct_url=<?>,disk_format='qcow2',id=3c45fa6c-8a99-4359-a34e-d89f4e1e77d0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3f8fb5175f85402ba20cf9c6989d47cf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-23T21:05:47Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.139 231315 DEBUG nova.virt.hardware [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.140 231315 DEBUG nova.virt.hardware [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.140 231315 DEBUG nova.virt.hardware [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.140 231315 DEBUG nova.virt.hardware [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.141 231315 DEBUG nova.virt.hardware [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.141 231315 DEBUG nova.virt.hardware [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.142 231315 DEBUG nova.virt.hardware [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.142 231315 DEBUG nova.virt.hardware [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.143 231315 DEBUG nova.virt.hardware [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.143 231315 DEBUG nova.virt.hardware [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.149 231315 DEBUG oslo_concurrency.processutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:09:25 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 16:09:25 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1642653001' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.655 231315 DEBUG oslo_concurrency.processutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.701 231315 DEBUG nova.storage.rbd_utils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:09:25 np0005532763 nova_compute[231311]: 2025-11-23 21:09:25.711 231315 DEBUG oslo_concurrency.processutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:09:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.016 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:09:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:26.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:09:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:26 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 16:09:26 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1869792718' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.207 231315 DEBUG oslo_concurrency.processutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.209 231315 DEBUG nova.virt.libvirt.vif [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T21:09:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-626843533',display_name='tempest-TestNetworkBasicOps-server-626843533',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-626843533',id=4,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBzFKgfz1QVXAYBgw9WYLDmImQIyNZIUJvYaUSeZsmfvEoA7CUytAymkLL0tqBwm8cJVrzUl6E9R6D/qdooFrc51SiAGOyjiHvRBM9c3gaFOzuWbTw1Aa3lZ7MmCQiSUEQ==',key_name='tempest-TestNetworkBasicOps-1952591884',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-mabh37mo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-23T21:09:20Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=227fff00-2bf2-4d7a-9ee7-ff4eaddc0880,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "address": "fa:16:3e:54:70:bb", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a20c477-83", "ovs_interfaceid": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.210 231315 DEBUG nova.network.os_vif_util [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "address": "fa:16:3e:54:70:bb", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a20c477-83", "ovs_interfaceid": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.211 231315 DEBUG nova.network.os_vif_util [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:70:bb,bridge_name='br-int',has_traffic_filtering=True,id=8a20c477-8372-4fe9-9b5f-ae2695da3fcd,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a20c477-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.212 231315 DEBUG nova.objects.instance [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lazy-loading 'pci_devices' on Instance uuid 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.230 231315 DEBUG nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] End _get_guest_xml xml=<domain type="kvm">
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  <uuid>227fff00-2bf2-4d7a-9ee7-ff4eaddc0880</uuid>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  <name>instance-00000004</name>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  <memory>131072</memory>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  <vcpu>1</vcpu>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  <metadata>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <nova:name>tempest-TestNetworkBasicOps-server-626843533</nova:name>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <nova:creationTime>2025-11-23 21:09:25</nova:creationTime>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <nova:flavor name="m1.nano">
Nov 23 16:09:26 np0005532763 nova_compute[231311]:        <nova:memory>128</nova:memory>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:        <nova:disk>1</nova:disk>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:        <nova:swap>0</nova:swap>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:        <nova:ephemeral>0</nova:ephemeral>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:        <nova:vcpus>1</nova:vcpus>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      </nova:flavor>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <nova:owner>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:        <nova:user uuid="9fb5352c62684f2ba3a326a953a10dfe">tempest-TestNetworkBasicOps-1975357669-project-member</nova:user>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:        <nova:project uuid="782593db60784ab8bff41fe87d72ff5f">tempest-TestNetworkBasicOps-1975357669</nova:project>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      </nova:owner>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <nova:root type="image" uuid="3c45fa6c-8a99-4359-a34e-d89f4e1e77d0"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <nova:ports>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:        <nova:port uuid="8a20c477-8372-4fe9-9b5f-ae2695da3fcd">
Nov 23 16:09:26 np0005532763 nova_compute[231311]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:        </nova:port>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      </nova:ports>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    </nova:instance>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  </metadata>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  <sysinfo type="smbios">
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <system>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <entry name="manufacturer">RDO</entry>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <entry name="product">OpenStack Compute</entry>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <entry name="serial">227fff00-2bf2-4d7a-9ee7-ff4eaddc0880</entry>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <entry name="uuid">227fff00-2bf2-4d7a-9ee7-ff4eaddc0880</entry>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <entry name="family">Virtual Machine</entry>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    </system>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  </sysinfo>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  <os>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <boot dev="hd"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <smbios mode="sysinfo"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  </os>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  <features>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <acpi/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <apic/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <vmcoreinfo/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  </features>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  <clock offset="utc">
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <timer name="pit" tickpolicy="delay"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <timer name="hpet" present="no"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  </clock>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  <cpu mode="host-model" match="exact">
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <topology sockets="1" cores="1" threads="1"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  </cpu>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  <devices>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <disk type="network" device="disk">
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <driver type="raw" cache="none"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <source protocol="rbd" name="vms/227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_disk">
Nov 23 16:09:26 np0005532763 nova_compute[231311]:        <host name="192.168.122.100" port="6789"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:        <host name="192.168.122.102" port="6789"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:        <host name="192.168.122.101" port="6789"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      </source>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <auth username="openstack">
Nov 23 16:09:26 np0005532763 nova_compute[231311]:        <secret type="ceph" uuid="03808be8-ae4a-5548-82e6-4a294f1bc627"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      </auth>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <target dev="vda" bus="virtio"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    </disk>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <disk type="network" device="cdrom">
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <driver type="raw" cache="none"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <source protocol="rbd" name="vms/227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_disk.config">
Nov 23 16:09:26 np0005532763 nova_compute[231311]:        <host name="192.168.122.100" port="6789"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:        <host name="192.168.122.102" port="6789"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:        <host name="192.168.122.101" port="6789"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      </source>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <auth username="openstack">
Nov 23 16:09:26 np0005532763 nova_compute[231311]:        <secret type="ceph" uuid="03808be8-ae4a-5548-82e6-4a294f1bc627"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      </auth>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <target dev="sda" bus="sata"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    </disk>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <interface type="ethernet">
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <mac address="fa:16:3e:54:70:bb"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <model type="virtio"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <driver name="vhost" rx_queue_size="512"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <mtu size="1442"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <target dev="tap8a20c477-83"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    </interface>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <serial type="pty">
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <log file="/var/lib/nova/instances/227fff00-2bf2-4d7a-9ee7-ff4eaddc0880/console.log" append="off"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    </serial>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <video>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <model type="virtio"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    </video>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <input type="tablet" bus="usb"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <rng model="virtio">
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <backend model="random">/dev/urandom</backend>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    </rng>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <controller type="usb" index="0"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    <memballoon model="virtio">
Nov 23 16:09:26 np0005532763 nova_compute[231311]:      <stats period="10"/>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:    </memballoon>
Nov 23 16:09:26 np0005532763 nova_compute[231311]:  </devices>
Nov 23 16:09:26 np0005532763 nova_compute[231311]: </domain>
Nov 23 16:09:26 np0005532763 nova_compute[231311]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.232 231315 DEBUG nova.compute.manager [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Preparing to wait for external event network-vif-plugged-8a20c477-8372-4fe9-9b5f-ae2695da3fcd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.233 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "227fff00-2bf2-4d7a-9ee7-ff4eaddc0880-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.233 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "227fff00-2bf2-4d7a-9ee7-ff4eaddc0880-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.234 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "227fff00-2bf2-4d7a-9ee7-ff4eaddc0880-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.235 231315 DEBUG nova.virt.libvirt.vif [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T21:09:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-626843533',display_name='tempest-TestNetworkBasicOps-server-626843533',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-626843533',id=4,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBzFKgfz1QVXAYBgw9WYLDmImQIyNZIUJvYaUSeZsmfvEoA7CUytAymkLL0tqBwm8cJVrzUl6E9R6D/qdooFrc51SiAGOyjiHvRBM9c3gaFOzuWbTw1Aa3lZ7MmCQiSUEQ==',key_name='tempest-TestNetworkBasicOps-1952591884',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-mabh37mo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-23T21:09:20Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=227fff00-2bf2-4d7a-9ee7-ff4eaddc0880,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "address": "fa:16:3e:54:70:bb", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a20c477-83", "ovs_interfaceid": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.235 231315 DEBUG nova.network.os_vif_util [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "address": "fa:16:3e:54:70:bb", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a20c477-83", "ovs_interfaceid": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.237 231315 DEBUG nova.network.os_vif_util [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:70:bb,bridge_name='br-int',has_traffic_filtering=True,id=8a20c477-8372-4fe9-9b5f-ae2695da3fcd,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a20c477-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.237 231315 DEBUG os_vif [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:70:bb,bridge_name='br-int',has_traffic_filtering=True,id=8a20c477-8372-4fe9-9b5f-ae2695da3fcd,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a20c477-83') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.239 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.239 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.240 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.250 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.250 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8a20c477-83, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.251 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8a20c477-83, col_values=(('external_ids', {'iface-id': '8a20c477-8372-4fe9-9b5f-ae2695da3fcd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:54:70:bb', 'vm-uuid': '227fff00-2bf2-4d7a-9ee7-ff4eaddc0880'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:09:26 np0005532763 NetworkManager[48849]: <info>  [1763932166.2549] manager: (tap8a20c477-83): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.254 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.258 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.265 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.267 231315 INFO os_vif [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:70:bb,bridge_name='br-int',has_traffic_filtering=True,id=8a20c477-8372-4fe9-9b5f-ae2695da3fcd,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a20c477-83')#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.329 231315 DEBUG nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.329 231315 DEBUG nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.329 231315 DEBUG nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No VIF found with MAC fa:16:3e:54:70:bb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.330 231315 INFO nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Using config drive#033[00m
Nov 23 16:09:26 np0005532763 nova_compute[231311]: 2025-11-23 21:09:26.360 231315 DEBUG nova.storage.rbd_utils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:09:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:26.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.162 231315 INFO nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Creating config drive at /var/lib/nova/instances/227fff00-2bf2-4d7a-9ee7-ff4eaddc0880/disk.config#033[00m
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.171 231315 DEBUG oslo_concurrency.processutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/227fff00-2bf2-4d7a-9ee7-ff4eaddc0880/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5l7sn4b2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.310 231315 DEBUG oslo_concurrency.processutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/227fff00-2bf2-4d7a-9ee7-ff4eaddc0880/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5l7sn4b2" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.352 231315 DEBUG nova.storage.rbd_utils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.357 231315 DEBUG oslo_concurrency.processutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/227fff00-2bf2-4d7a-9ee7-ff4eaddc0880/disk.config 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.558 231315 DEBUG oslo_concurrency.processutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/227fff00-2bf2-4d7a-9ee7-ff4eaddc0880/disk.config 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.201s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.559 231315 INFO nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Deleting local config drive /var/lib/nova/instances/227fff00-2bf2-4d7a-9ee7-ff4eaddc0880/disk.config because it was imported into RBD.#033[00m
Nov 23 16:09:27 np0005532763 virtqemud[230850]: Cannot recv data: Connection reset by peer
Nov 23 16:09:27 np0005532763 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.586 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.595 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Error launching a defined domain with XML: <domain type='kvm'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  <name>instance-00000004</name>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  <uuid>227fff00-2bf2-4d7a-9ee7-ff4eaddc0880</uuid>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  <metadata>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <nova:name>tempest-TestNetworkBasicOps-server-626843533</nova:name>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <nova:creationTime>2025-11-23 21:09:25</nova:creationTime>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <nova:flavor name="m1.nano">
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        <nova:memory>128</nova:memory>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        <nova:disk>1</nova:disk>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        <nova:swap>0</nova:swap>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        <nova:ephemeral>0</nova:ephemeral>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        <nova:vcpus>1</nova:vcpus>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      </nova:flavor>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <nova:owner>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        <nova:user uuid="9fb5352c62684f2ba3a326a953a10dfe">tempest-TestNetworkBasicOps-1975357669-project-member</nova:user>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        <nova:project uuid="782593db60784ab8bff41fe87d72ff5f">tempest-TestNetworkBasicOps-1975357669</nova:project>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      </nova:owner>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <nova:root type="image" uuid="3c45fa6c-8a99-4359-a34e-d89f4e1e77d0"/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <nova:ports>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        <nova:port uuid="8a20c477-8372-4fe9-9b5f-ae2695da3fcd">
Nov 23 16:09:27 np0005532763 nova_compute[231311]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        </nova:port>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      </nova:ports>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </nova:instance>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  </metadata>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  <memory unit='KiB'>131072</memory>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  <vcpu placement='static'>1</vcpu>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  <sysinfo type='smbios'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <system>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <entry name='manufacturer'>RDO</entry>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <entry name='product'>OpenStack Compute</entry>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <entry name='serial'>227fff00-2bf2-4d7a-9ee7-ff4eaddc0880</entry>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <entry name='uuid'>227fff00-2bf2-4d7a-9ee7-ff4eaddc0880</entry>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <entry name='family'>Virtual Machine</entry>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </system>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  </sysinfo>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  <os>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <boot dev='hd'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <smbios mode='sysinfo'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  </os>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  <features>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <acpi/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <apic/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <vmcoreinfo state='on'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  </features>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  <cpu mode='host-model' check='partial'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  </cpu>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  <clock offset='utc'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <timer name='pit' tickpolicy='delay'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <timer name='hpet' present='no'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  </clock>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  <on_poweroff>destroy</on_poweroff>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  <on_reboot>restart</on_reboot>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  <on_crash>destroy</on_crash>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  <devices>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <disk type='network' device='disk'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <driver name='qemu' type='raw' cache='none'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <auth username='openstack'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        <secret type='ceph' uuid='03808be8-ae4a-5548-82e6-4a294f1bc627'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      </auth>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <source protocol='rbd' name='vms/227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_disk'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        <host name='192.168.122.100' port='6789'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        <host name='192.168.122.102' port='6789'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        <host name='192.168.122.101' port='6789'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      </source>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target dev='vda' bus='virtio'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </disk>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <disk type='network' device='cdrom'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <driver name='qemu' type='raw' cache='none'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <auth username='openstack'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        <secret type='ceph' uuid='03808be8-ae4a-5548-82e6-4a294f1bc627'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      </auth>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <source protocol='rbd' name='vms/227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_disk.config'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        <host name='192.168.122.100' port='6789'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        <host name='192.168.122.102' port='6789'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        <host name='192.168.122.101' port='6789'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      </source>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target dev='sda' bus='sata'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <readonly/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </disk>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='0' model='pcie-root'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='1' port='0x10'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='2' port='0x11'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='3' port='0x12'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='4' port='0x13'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='5' port='0x14'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='6' port='0x15'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='7' port='0x16'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='8' port='0x17'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='9' port='0x18'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='10' port='0x19'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='11' port='0x1a'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='12' port='0x1b'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='13' port='0x1c'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='14' port='0x1d'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='15' port='0x1e'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='16' port='0x1f'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='17' port='0x20'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='18' port='0x21'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='19' port='0x22'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='20' port='0x23'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='21' port='0x24'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='22' port='0x25'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='23' port='0x26'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='24' port='0x27'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-root-port'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target chassis='25' port='0x28'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model name='pcie-pci-bridge'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <controller type='sata' index='0'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </controller>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <interface type='ethernet'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <mac address='fa:16:3e:54:70:bb'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target dev='tap8a20c477-83'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model type='virtio'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <driver name='vhost' rx_queue_size='512'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <mtu size='1442'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </interface>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <serial type='pty'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <log file='/var/lib/nova/instances/227fff00-2bf2-4d7a-9ee7-ff4eaddc0880/console.log' append='off'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target type='isa-serial' port='0'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:        <model name='isa-serial'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      </target>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </serial>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <console type='pty'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <log file='/var/lib/nova/instances/227fff00-2bf2-4d7a-9ee7-ff4eaddc0880/console.log' append='off'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <target type='serial' port='0'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </console>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <input type='tablet' bus='usb'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='usb' bus='0' port='1'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </input>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <input type='mouse' bus='ps2'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <input type='keyboard' bus='ps2'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <graphics type='vnc' port='-1' autoport='yes' listen='::0'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <listen type='address' address='::0'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </graphics>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <audio id='1' type='none'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <video>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <model type='virtio' heads='1' primary='yes'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </video>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <watchdog model='itco' action='reset'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <memballoon model='virtio'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <stats period='10'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </memballoon>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    <rng model='virtio'>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <backend model='random'>/dev/urandom</backend>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:    </rng>
Nov 23 16:09:27 np0005532763 nova_compute[231311]:  </devices>
Nov 23 16:09:27 np0005532763 nova_compute[231311]: </domain>
Nov 23 16:09:27 np0005532763 nova_compute[231311]: : libvirt.libvirtError: Cannot recv data: Connection reset by peer
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest Traceback (most recent call last):
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py", line 165, in launch
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest     return self._domain.createWithFlags(flags)
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 193, in doit
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest     result = proxy_call(self._autowrap, f, *args, **kwargs)
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 151, in proxy_call
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest     rv = execute(f, *args, **kwargs)
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 132, in execute
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest     six.reraise(c, e, tb)
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest   File "/usr/lib/python3.9/site-packages/six.py", line 709, in reraise
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest     raise value
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 86, in tworker
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest     rv = meth(*args, **kwargs)
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest   File "/usr/lib64/python3.9/site-packages/libvirt.py", line 1426, in createWithFlags
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest     raise libvirtError('virDomainCreateWithFlags() failed')
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest libvirt.libvirtError: Cannot recv data: Connection reset by peer
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.602 231315 ERROR nova.virt.libvirt.guest
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.609 231315 ERROR nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Failed to start libvirt guest: libvirt.libvirtError: Cannot recv data: Connection reset by peer
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.611 231315 DEBUG nova.virt.libvirt.vif [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T21:09:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-626843533',display_name='tempest-TestNetworkBasicOps-server-626843533',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-626843533',id=4,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBzFKgfz1QVXAYBgw9WYLDmImQIyNZIUJvYaUSeZsmfvEoA7CUytAymkLL0tqBwm8cJVrzUl6E9R6D/qdooFrc51SiAGOyjiHvRBM9c3gaFOzuWbTw1Aa3lZ7MmCQiSUEQ==',key_name='tempest-TestNetworkBasicOps-1952591884',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-mabh37mo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-23T21:09:20Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=227fff00-2bf2-4d7a-9ee7-ff4eaddc0880,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "address": "fa:16:3e:54:70:bb", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a20c477-83", "ovs_interfaceid": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.612 231315 DEBUG nova.network.os_vif_util [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "address": "fa:16:3e:54:70:bb", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a20c477-83", "ovs_interfaceid": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.613 231315 DEBUG nova.network.os_vif_util [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:70:bb,bridge_name='br-int',has_traffic_filtering=True,id=8a20c477-8372-4fe9-9b5f-ae2695da3fcd,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a20c477-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.614 231315 DEBUG os_vif [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:70:bb,bridge_name='br-int',has_traffic_filtering=True,id=8a20c477-8372-4fe9-9b5f-ae2695da3fcd,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a20c477-83') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.617 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.617 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8a20c477-83, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.620 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.622 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:27 np0005532763 nova_compute[231311]: 2025-11-23 21:09:27.625 231315 INFO os_vif [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:70:bb,bridge_name='br-int',has_traffic_filtering=True,id=8a20c477-8372-4fe9-9b5f-ae2695da3fcd,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a20c477-83')#033[00m
Nov 23 16:09:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.051 231315 INFO nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Deleting instance files /var/lib/nova/instances/227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_del#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.052 231315 INFO nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Deletion of /var/lib/nova/instances/227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_del complete#033[00m
Nov 23 16:09:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:28.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Instance failed to spawn: libvirt.libvirtError: Cannot recv data: Connection reset by peer
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Traceback (most recent call last):
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2864, in _build_resources
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     yield resources
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     self.driver.spawn(context, instance, image_meta,
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4411, in spawn
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     self._create_guest_with_network(
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 7773, in _create_guest_with_network
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     self._cleanup(
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     self.force_reraise()
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     raise self.value
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 7750, in _create_guest_with_network
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     guest = self._create_guest(
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 7689, in _create_guest
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     guest.launch(pause=pause)
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py", line 168, in launch
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     LOG.exception('Error launching a defined domain with XML: %s',
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     self.force_reraise()
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     raise self.value
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py", line 165, in launch
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     return self._domain.createWithFlags(flags)
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 193, in doit
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     result = proxy_call(self._autowrap, f, *args, **kwargs)
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 151, in proxy_call
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     rv = execute(f, *args, **kwargs)
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 132, in execute
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     six.reraise(c, e, tb)
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/six.py", line 709, in reraise
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     raise value
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 86, in tworker
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     rv = meth(*args, **kwargs)
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib64/python3.9/site-packages/libvirt.py", line 1426, in createWithFlags
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     raise libvirtError('virDomainCreateWithFlags() failed')
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] libvirt.libvirtError: Cannot recv data: Connection reset by peer
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.118 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] #033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.123 231315 INFO nova.compute.manager [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Terminating instance#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.125 231315 DEBUG nova.compute.manager [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.129 231315 DEBUG nova.virt.libvirt.driver [-] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] During wait destroy, instance disappeared. _wait_for_destroy /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1527#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.129 231315 INFO nova.virt.libvirt.driver [-] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Instance destroyed successfully.#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.130 231315 DEBUG nova.virt.libvirt.vif [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2025-11-23T21:09:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-626843533',display_name='tempest-TestNetworkBasicOps-server-626843533',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-626843533',id=4,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBzFKgfz1QVXAYBgw9WYLDmImQIyNZIUJvYaUSeZsmfvEoA7CUytAymkLL0tqBwm8cJVrzUl6E9R6D/qdooFrc51SiAGOyjiHvRBM9c3gaFOzuWbTw1Aa3lZ7MmCQiSUEQ==',key_name='tempest-TestNetworkBasicOps-1952591884',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-mabh37mo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-23T21:09:20Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=227fff00-2bf2-4d7a-9ee7-ff4eaddc0880,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "address": "fa:16:3e:54:70:bb", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a20c477-83", "ovs_interfaceid": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.130 231315 DEBUG nova.network.os_vif_util [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "address": "fa:16:3e:54:70:bb", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a20c477-83", "ovs_interfaceid": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.131 231315 DEBUG nova.network.os_vif_util [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:70:bb,bridge_name='br-int',has_traffic_filtering=True,id=8a20c477-8372-4fe9-9b5f-ae2695da3fcd,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a20c477-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.131 231315 DEBUG os_vif [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:70:bb,bridge_name='br-int',has_traffic_filtering=True,id=8a20c477-8372-4fe9-9b5f-ae2695da3fcd,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a20c477-83') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.133 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.134 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8a20c477-83, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.134 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 23 16:09:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.137 231315 INFO os_vif [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:70:bb,bridge_name='br-int',has_traffic_filtering=True,id=8a20c477-8372-4fe9-9b5f-ae2695da3fcd,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a20c477-83')#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.175 231315 INFO nova.virt.libvirt.driver [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Deletion of /var/lib/nova/instances/227fff00-2bf2-4d7a-9ee7-ff4eaddc0880_del complete#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.262 231315 INFO nova.compute.manager [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Took 0.14 seconds to destroy the instance on the hypervisor.#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.265 231315 DEBUG nova.compute.claims [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Aborting claim: <nova.compute.claims.Claim object at 0x7ff8d41a55b0> abort /usr/lib/python3.9/site-packages/nova/compute/claims.py:85#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.267 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.267 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.281 231315 DEBUG nova.network.neutron [req-20008053-7006-475c-bfef-914378ba67cd req-159e8e51-5ad1-41ef-a355-7691781dd9f9 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Updated VIF entry in instance network info cache for port 8a20c477-8372-4fe9-9b5f-ae2695da3fcd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.282 231315 DEBUG nova.network.neutron [req-20008053-7006-475c-bfef-914378ba67cd req-159e8e51-5ad1-41ef-a355-7691781dd9f9 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Updating instance_info_cache with network_info: [{"id": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "address": "fa:16:3e:54:70:bb", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a20c477-83", "ovs_interfaceid": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.308 231315 DEBUG oslo_concurrency.lockutils [req-20008053-7006-475c-bfef-914378ba67cd req-159e8e51-5ad1-41ef-a355-7691781dd9f9 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Releasing lock "refresh_cache-227fff00-2bf2-4d7a-9ee7-ff4eaddc0880" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.392 231315 DEBUG oslo_concurrency.processutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:09:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:28.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:28 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:09:28 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2717229072' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.964 231315 DEBUG oslo_concurrency.processutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.571s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.969 231315 DEBUG nova.compute.provider_tree [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:09:28 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.981 231315 DEBUG nova.scheduler.client.report [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:28.999 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:09:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:29 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:09:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:29 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:09:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:29 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Failed to build and run instance: libvirt.libvirtError: Cannot recv data: Connection reset by peer
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Traceback (most recent call last):
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 2611, in _build_and_run_instance
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     self.driver.spawn(context, instance, image_meta,
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 4411, in spawn
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     self._create_guest_with_network(
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 7773, in _create_guest_with_network
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     self._cleanup(
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     self.force_reraise()
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     raise self.value
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 7750, in _create_guest_with_network
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     guest = self._create_guest(
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 7689, in _create_guest
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     guest.launch(pause=pause)
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py", line 168, in launch
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     LOG.exception('Error launching a defined domain with XML: %s',
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     self.force_reraise()
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     raise self.value
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py", line 165, in launch
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     return self._domain.createWithFlags(flags)
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 193, in doit
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     result = proxy_call(self._autowrap, f, *args, **kwargs)
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 151, in proxy_call
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     rv = execute(f, *args, **kwargs)
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 132, in execute
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     six.reraise(c, e, tb)
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/six.py", line 709, in reraise
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     raise value
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 86, in tworker
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     rv = meth(*args, **kwargs)
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]   File "/usr/lib64/python3.9/site-packages/libvirt.py", line 1426, in createWithFlags
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880]     raise libvirtError('virDomainCreateWithFlags() failed')
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] libvirt.libvirtError: Cannot recv data: Connection reset by peer
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.000 231315 ERROR nova.compute.manager [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] #033[00m
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.001 231315 DEBUG nova.compute.utils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] libvirtError notify_about_instance_usage /usr/lib/python3.9/site-packages/nova/compute/utils.py:430#033[00m
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.002 231315 DEBUG nova.compute.manager [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Build of instance 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880 was re-scheduled: Cannot recv data: Connection reset by peer _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2450#033[00m
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.002 231315 DEBUG nova.compute.manager [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Unplugging VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:2976#033[00m
Nov 23 16:09:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:29 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.003 231315 DEBUG nova.virt.libvirt.vif [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='',created_at=2025-11-23T21:09:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-626843533',display_name='tempest-TestNetworkBasicOps-server-626843533',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host=None,hostname='tempest-testnetworkbasicops-server-626843533',id=4,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBzFKgfz1QVXAYBgw9WYLDmImQIyNZIUJvYaUSeZsmfvEoA7CUytAymkLL0tqBwm8cJVrzUl6E9R6D/qdooFrc51SiAGOyjiHvRBM9c3gaFOzuWbTw1Aa3lZ7MmCQiSUEQ==',key_name='tempest-TestNetworkBasicOps-1952591884',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node=None,numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-mabh37mo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-23T21:09:28Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=227fff00-2bf2-4d7a-9ee7-ff4eaddc0880,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "address": "fa:16:3e:54:70:bb", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a20c477-83", "ovs_interfaceid": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.003 231315 DEBUG nova.network.os_vif_util [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "address": "fa:16:3e:54:70:bb", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8a20c477-83", "ovs_interfaceid": "8a20c477-8372-4fe9-9b5f-ae2695da3fcd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.003 231315 DEBUG nova.network.os_vif_util [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:70:bb,bridge_name='br-int',has_traffic_filtering=True,id=8a20c477-8372-4fe9-9b5f-ae2695da3fcd,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a20c477-83') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.004 231315 DEBUG os_vif [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:70:bb,bridge_name='br-int',has_traffic_filtering=True,id=8a20c477-8372-4fe9-9b5f-ae2695da3fcd,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a20c477-83') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.005 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.005 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8a20c477-83, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.005 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.008 231315 INFO os_vif [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:70:bb,bridge_name='br-int',has_traffic_filtering=True,id=8a20c477-8372-4fe9-9b5f-ae2695da3fcd,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8a20c477-83')#033[00m
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.009 231315 DEBUG nova.compute.manager [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Unplugged VIFs for instance _cleanup_allocated_networks /usr/lib/python3.9/site-packages/nova/compute/manager.py:3012#033[00m
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.009 231315 DEBUG nova.compute.manager [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 23 16:09:29 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.009 231315 DEBUG nova.network.neutron [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 23 16:09:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:09:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:30 np0005532763 nova_compute[231311]: 2025-11-23 21:09:29.999 231315 DEBUG nova.network.neutron [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:09:30 np0005532763 nova_compute[231311]: 2025-11-23 21:09:30.018 231315 INFO nova.compute.manager [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880] Took 1.01 seconds to deallocate network for instance.#033[00m
Nov 23 16:09:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:30.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:30 np0005532763 nova_compute[231311]: 2025-11-23 21:09:30.180 231315 INFO nova.scheduler.client.report [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Deleted allocations for instance 227fff00-2bf2-4d7a-9ee7-ff4eaddc0880#033[00m
Nov 23 16:09:30 np0005532763 nova_compute[231311]: 2025-11-23 21:09:30.236 231315 DEBUG oslo_concurrency.lockutils [None req-be412014-dc3a-45b4-8042-20983b5c52ac 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "227fff00-2bf2-4d7a-9ee7-ff4eaddc0880" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.907s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:09:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:30.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:31 np0005532763 nova_compute[231311]: 2025-11-23 21:09:31.040 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:31 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:09:31 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:09:31 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:09:31 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:09:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:32.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:32 np0005532763 nova_compute[231311]: 2025-11-23 21:09:32.622 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:32.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:33 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:09:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:33 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:09:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:33 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:09:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:34 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:09:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:34.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:09:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:34.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:36 np0005532763 nova_compute[231311]: 2025-11-23 21:09:36.043 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:36.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:36 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:09:36 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:09:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:36.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:37 np0005532763 nova_compute[231311]: 2025-11-23 21:09:37.625 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:38.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:38.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:38 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:09:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:38 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:09:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:38 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:09:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:39 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:09:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:09:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:40.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:09:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:40.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:09:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:41 np0005532763 nova_compute[231311]: 2025-11-23 21:09:41.085 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:42.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:42 np0005532763 nova_compute[231311]: 2025-11-23 21:09:42.627 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:42.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:43 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:09:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:43 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:09:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:43 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:09:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:44 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:09:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:44.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:44 np0005532763 podman[237066]: 2025-11-23 21:09:44.666000749 +0000 UTC m=+0.106121325 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 23 16:09:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:09:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:44.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:46 np0005532763 nova_compute[231311]: 2025-11-23 21:09:46.125 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:46.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:46.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:47 np0005532763 nova_compute[231311]: 2025-11-23 21:09:47.631 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:09:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:48.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:09:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:09:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:48.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:09:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:48 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:09:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:48 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:09:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:48 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:09:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:49 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:09:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:09:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:50.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:09:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:50.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:09:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:51 np0005532763 nova_compute[231311]: 2025-11-23 21:09:51.127 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:09:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:52.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:09:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:09:52.223 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:09:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:09:52.224 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:09:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:09:52.224 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:09:52 np0005532763 podman[237097]: 2025-11-23 21:09:52.316042042 +0000 UTC m=+0.186053899 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 16:09:52 np0005532763 nova_compute[231311]: 2025-11-23 21:09:52.634 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:52.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:53 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:09:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:53 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:09:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:53 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:09:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:54 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:09:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:54.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:09:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:54.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:55 np0005532763 podman[237128]: 2025-11-23 21:09:55.205289137 +0000 UTC m=+0.085057898 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:09:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:56.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:56 np0005532763 nova_compute[231311]: 2025-11-23 21:09:56.164 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:56.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:57 np0005532763 nova_compute[231311]: 2025-11-23 21:09:57.636 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:09:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:09:58.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:09:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:09:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:09:58.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:09:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:59 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:09:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:59 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:09:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:59 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:09:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:09:59 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:09:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:09:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:09:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:09:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:09:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:00.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:00 np0005532763 nova_compute[231311]: 2025-11-23 21:10:00.381 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:10:00 np0005532763 ceph-mon[75752]: overall HEALTH_OK
Nov 23 16:10:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:00.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:01 np0005532763 nova_compute[231311]: 2025-11-23 21:10:01.207 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:02.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:02 np0005532763 nova_compute[231311]: 2025-11-23 21:10:02.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:10:02 np0005532763 nova_compute[231311]: 2025-11-23 21:10:02.639 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:02.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:03 np0005532763 nova_compute[231311]: 2025-11-23 21:10:03.381 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "28677820-c1a2-4bbc-91d4-f2d7448eee33" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:10:03 np0005532763 nova_compute[231311]: 2025-11-23 21:10:03.381 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "28677820-c1a2-4bbc-91d4-f2d7448eee33" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:10:03 np0005532763 nova_compute[231311]: 2025-11-23 21:10:03.398 231315 DEBUG nova.compute.manager [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 23 16:10:03 np0005532763 nova_compute[231311]: 2025-11-23 21:10:03.476 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:10:03 np0005532763 nova_compute[231311]: 2025-11-23 21:10:03.477 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:10:03 np0005532763 nova_compute[231311]: 2025-11-23 21:10:03.484 231315 DEBUG nova.virt.hardware [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 23 16:10:03 np0005532763 nova_compute[231311]: 2025-11-23 21:10:03.485 231315 INFO nova.compute.claims [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 23 16:10:03 np0005532763 nova_compute[231311]: 2025-11-23 21:10:03.582 231315 DEBUG oslo_concurrency.processutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:10:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:03 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:10:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:04 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:10:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:04 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:10:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:04 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:10:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:10:04 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1813209308' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.066 231315 DEBUG oslo_concurrency.processutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.074 231315 DEBUG nova.compute.provider_tree [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.090 231315 DEBUG nova.scheduler.client.report [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.111 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.112 231315 DEBUG nova.compute.manager [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 23 16:10:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:04.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.158 231315 DEBUG nova.compute.manager [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.159 231315 DEBUG nova.network.neutron [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.179 231315 INFO nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.197 231315 DEBUG nova.compute.manager [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.307 231315 DEBUG nova.compute.manager [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.309 231315 DEBUG nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.310 231315 INFO nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Creating image(s)#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.350 231315 DEBUG nova.storage.rbd_utils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 28677820-c1a2-4bbc-91d4-f2d7448eee33_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.393 231315 DEBUG nova.storage.rbd_utils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 28677820-c1a2-4bbc-91d4-f2d7448eee33_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.441 231315 DEBUG nova.storage.rbd_utils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 28677820-c1a2-4bbc-91d4-f2d7448eee33_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.448 231315 DEBUG oslo_concurrency.processutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.484 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.486 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.487 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.487 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.517 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.518 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.519 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.520 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.562 231315 DEBUG oslo_concurrency.processutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 --force-share --output=json" returned: 0 in 0.114s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.564 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.565 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.566 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.605 231315 DEBUG nova.storage.rbd_utils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 28677820-c1a2-4bbc-91d4-f2d7448eee33_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.611 231315 DEBUG oslo_concurrency.processutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 28677820-c1a2-4bbc-91d4-f2d7448eee33_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:10:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:10:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:04.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:04 np0005532763 nova_compute[231311]: 2025-11-23 21:10:04.938 231315 DEBUG oslo_concurrency.processutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 28677820-c1a2-4bbc-91d4-f2d7448eee33_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.327s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:10:05 np0005532763 nova_compute[231311]: 2025-11-23 21:10:05.030 231315 DEBUG nova.policy [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9fb5352c62684f2ba3a326a953a10dfe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '782593db60784ab8bff41fe87d72ff5f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 23 16:10:05 np0005532763 nova_compute[231311]: 2025-11-23 21:10:05.040 231315 DEBUG nova.storage.rbd_utils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] resizing rbd image 28677820-c1a2-4bbc-91d4-f2d7448eee33_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Nov 23 16:10:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:05 np0005532763 nova_compute[231311]: 2025-11-23 21:10:05.171 231315 DEBUG nova.objects.instance [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lazy-loading 'migration_context' on Instance uuid 28677820-c1a2-4bbc-91d4-f2d7448eee33 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:10:05 np0005532763 nova_compute[231311]: 2025-11-23 21:10:05.192 231315 DEBUG nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 23 16:10:05 np0005532763 nova_compute[231311]: 2025-11-23 21:10:05.193 231315 DEBUG nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Ensure instance console log exists: /var/lib/nova/instances/28677820-c1a2-4bbc-91d4-f2d7448eee33/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 23 16:10:05 np0005532763 nova_compute[231311]: 2025-11-23 21:10:05.194 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:10:05 np0005532763 nova_compute[231311]: 2025-11-23 21:10:05.195 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:10:05 np0005532763 nova_compute[231311]: 2025-11-23 21:10:05.195 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:10:05 np0005532763 nova_compute[231311]: 2025-11-23 21:10:05.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:10:05 np0005532763 nova_compute[231311]: 2025-11-23 21:10:05.400 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:10:05 np0005532763 nova_compute[231311]: 2025-11-23 21:10:05.401 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:10:05 np0005532763 nova_compute[231311]: 2025-11-23 21:10:05.402 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:10:05 np0005532763 nova_compute[231311]: 2025-11-23 21:10:05.402 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:10:05 np0005532763 nova_compute[231311]: 2025-11-23 21:10:05.403 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:10:05 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:10:05 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2056082769' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:10:05 np0005532763 nova_compute[231311]: 2025-11-23 21:10:05.884 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:10:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:06 np0005532763 nova_compute[231311]: 2025-11-23 21:10:06.124 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:10:06 np0005532763 nova_compute[231311]: 2025-11-23 21:10:06.127 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4928MB free_disk=59.93894577026367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:10:06 np0005532763 nova_compute[231311]: 2025-11-23 21:10:06.127 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:10:06 np0005532763 nova_compute[231311]: 2025-11-23 21:10:06.128 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:10:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:06.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:06 np0005532763 nova_compute[231311]: 2025-11-23 21:10:06.187 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Instance 28677820-c1a2-4bbc-91d4-f2d7448eee33 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 23 16:10:06 np0005532763 nova_compute[231311]: 2025-11-23 21:10:06.188 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:10:06 np0005532763 nova_compute[231311]: 2025-11-23 21:10:06.188 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:10:06 np0005532763 nova_compute[231311]: 2025-11-23 21:10:06.260 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:06 np0005532763 nova_compute[231311]: 2025-11-23 21:10:06.284 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:10:06 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:10:06 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2133734318' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:10:06 np0005532763 nova_compute[231311]: 2025-11-23 21:10:06.728 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:10:06 np0005532763 nova_compute[231311]: 2025-11-23 21:10:06.738 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:10:06 np0005532763 nova_compute[231311]: 2025-11-23 21:10:06.755 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:10:06 np0005532763 nova_compute[231311]: 2025-11-23 21:10:06.781 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:10:06 np0005532763 nova_compute[231311]: 2025-11-23 21:10:06.781 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.654s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:10:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:06.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:07 np0005532763 nova_compute[231311]: 2025-11-23 21:10:07.545 231315 DEBUG nova.network.neutron [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Successfully created port: 3b1f1868-aa98-47ef-be0e-37dd0abbfa44 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 23 16:10:07 np0005532763 nova_compute[231311]: 2025-11-23 21:10:07.643 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:07 np0005532763 nova_compute[231311]: 2025-11-23 21:10:07.782 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:10:07 np0005532763 nova_compute[231311]: 2025-11-23 21:10:07.783 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:10:07 np0005532763 nova_compute[231311]: 2025-11-23 21:10:07.783 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:10:07 np0005532763 nova_compute[231311]: 2025-11-23 21:10:07.783 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:10:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:10:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1797927620' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:10:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:10:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1797927620' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:10:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:08.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:08.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:10:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:10:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:10:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:09 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:10:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:09 np0005532763 nova_compute[231311]: 2025-11-23 21:10:09.185 231315 DEBUG nova.network.neutron [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Successfully updated port: 3b1f1868-aa98-47ef-be0e-37dd0abbfa44 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 23 16:10:09 np0005532763 nova_compute[231311]: 2025-11-23 21:10:09.196 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "refresh_cache-28677820-c1a2-4bbc-91d4-f2d7448eee33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:10:09 np0005532763 nova_compute[231311]: 2025-11-23 21:10:09.197 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquired lock "refresh_cache-28677820-c1a2-4bbc-91d4-f2d7448eee33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:10:09 np0005532763 nova_compute[231311]: 2025-11-23 21:10:09.197 231315 DEBUG nova.network.neutron [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 23 16:10:09 np0005532763 nova_compute[231311]: 2025-11-23 21:10:09.269 231315 DEBUG nova.compute.manager [req-28622c9c-f9de-474e-aa03-337f56a349a8 req-d0b0756e-3184-4a25-a822-bd93af561a7c 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Received event network-changed-3b1f1868-aa98-47ef-be0e-37dd0abbfa44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:10:09 np0005532763 nova_compute[231311]: 2025-11-23 21:10:09.270 231315 DEBUG nova.compute.manager [req-28622c9c-f9de-474e-aa03-337f56a349a8 req-d0b0756e-3184-4a25-a822-bd93af561a7c 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Refreshing instance network info cache due to event network-changed-3b1f1868-aa98-47ef-be0e-37dd0abbfa44. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 23 16:10:09 np0005532763 nova_compute[231311]: 2025-11-23 21:10:09.270 231315 DEBUG oslo_concurrency.lockutils [req-28622c9c-f9de-474e-aa03-337f56a349a8 req-d0b0756e-3184-4a25-a822-bd93af561a7c 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "refresh_cache-28677820-c1a2-4bbc-91d4-f2d7448eee33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:10:09 np0005532763 nova_compute[231311]: 2025-11-23 21:10:09.516 231315 DEBUG nova.network.neutron [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 23 16:10:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:10:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:10.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.404 231315 DEBUG nova.network.neutron [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Updating instance_info_cache with network_info: [{"id": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "address": "fa:16:3e:37:fc:98", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1f1868-aa", "ovs_interfaceid": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.423 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Releasing lock "refresh_cache-28677820-c1a2-4bbc-91d4-f2d7448eee33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.424 231315 DEBUG nova.compute.manager [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Instance network_info: |[{"id": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "address": "fa:16:3e:37:fc:98", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1f1868-aa", "ovs_interfaceid": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.424 231315 DEBUG oslo_concurrency.lockutils [req-28622c9c-f9de-474e-aa03-337f56a349a8 req-d0b0756e-3184-4a25-a822-bd93af561a7c 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquired lock "refresh_cache-28677820-c1a2-4bbc-91d4-f2d7448eee33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.424 231315 DEBUG nova.network.neutron [req-28622c9c-f9de-474e-aa03-337f56a349a8 req-d0b0756e-3184-4a25-a822-bd93af561a7c 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Refreshing network info cache for port 3b1f1868-aa98-47ef-be0e-37dd0abbfa44 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.426 231315 DEBUG nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Start _get_guest_xml network_info=[{"id": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "address": "fa:16:3e:37:fc:98", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1f1868-aa", "ovs_interfaceid": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-23T21:05:43Z,direct_url=<?>,disk_format='qcow2',id=3c45fa6c-8a99-4359-a34e-d89f4e1e77d0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3f8fb5175f85402ba20cf9c6989d47cf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-23T21:05:47Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'encryption_options': None, 'size': 0, 'image_id': '3c45fa6c-8a99-4359-a34e-d89f4e1e77d0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.431 231315 WARNING nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.439 231315 DEBUG nova.virt.libvirt.host [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.440 231315 DEBUG nova.virt.libvirt.host [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.442 231315 DEBUG nova.virt.libvirt.host [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.442 231315 DEBUG nova.virt.libvirt.host [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.443 231315 DEBUG nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.443 231315 DEBUG nova.virt.hardware [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-23T21:05:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='56044b93-2979-48aa-b67f-c37e1b489306',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-23T21:05:43Z,direct_url=<?>,disk_format='qcow2',id=3c45fa6c-8a99-4359-a34e-d89f4e1e77d0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3f8fb5175f85402ba20cf9c6989d47cf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-23T21:05:47Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.443 231315 DEBUG nova.virt.hardware [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.444 231315 DEBUG nova.virt.hardware [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.444 231315 DEBUG nova.virt.hardware [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.444 231315 DEBUG nova.virt.hardware [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.444 231315 DEBUG nova.virt.hardware [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.444 231315 DEBUG nova.virt.hardware [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.445 231315 DEBUG nova.virt.hardware [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.445 231315 DEBUG nova.virt.hardware [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.445 231315 DEBUG nova.virt.hardware [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.445 231315 DEBUG nova.virt.hardware [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.448 231315 DEBUG oslo_concurrency.processutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:10:10 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 16:10:10 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1538673241' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.884 231315 DEBUG oslo_concurrency.processutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.915 231315 DEBUG nova.storage.rbd_utils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 28677820-c1a2-4bbc-91d4-f2d7448eee33_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:10:10 np0005532763 nova_compute[231311]: 2025-11-23 21:10:10.919 231315 DEBUG oslo_concurrency.processutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:10:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:10.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.262 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:11 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 16:10:11 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/132932692' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.380 231315 DEBUG oslo_concurrency.processutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.381 231315 DEBUG nova.virt.libvirt.vif [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T21:10:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1871965757',display_name='tempest-TestNetworkBasicOps-server-1871965757',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1871965757',id=5,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNOAT9k3zGchBzWDN79+VgyhGUt8IeduAM1JHy7qxGVq4N9VQlTdW7nc+YKCPOnHiFUD9FwEiDOYQcEe6RaTaPalAn8aDudsWDDereIsBSXehGmgLe2qqOH5/26yedYazw==',key_name='tempest-TestNetworkBasicOps-1070907508',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-zim0hrvb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-23T21:10:04Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=28677820-c1a2-4bbc-91d4-f2d7448eee33,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "address": "fa:16:3e:37:fc:98", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1f1868-aa", "ovs_interfaceid": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.381 231315 DEBUG nova.network.os_vif_util [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "address": "fa:16:3e:37:fc:98", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1f1868-aa", "ovs_interfaceid": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.382 231315 DEBUG nova.network.os_vif_util [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:fc:98,bridge_name='br-int',has_traffic_filtering=True,id=3b1f1868-aa98-47ef-be0e-37dd0abbfa44,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b1f1868-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.383 231315 DEBUG nova.objects.instance [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lazy-loading 'pci_devices' on Instance uuid 28677820-c1a2-4bbc-91d4-f2d7448eee33 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.420 231315 DEBUG nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] End _get_guest_xml xml=<domain type="kvm">
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  <uuid>28677820-c1a2-4bbc-91d4-f2d7448eee33</uuid>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  <name>instance-00000005</name>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  <memory>131072</memory>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  <vcpu>1</vcpu>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  <metadata>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <nova:name>tempest-TestNetworkBasicOps-server-1871965757</nova:name>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <nova:creationTime>2025-11-23 21:10:10</nova:creationTime>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <nova:flavor name="m1.nano">
Nov 23 16:10:11 np0005532763 nova_compute[231311]:        <nova:memory>128</nova:memory>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:        <nova:disk>1</nova:disk>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:        <nova:swap>0</nova:swap>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:        <nova:ephemeral>0</nova:ephemeral>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:        <nova:vcpus>1</nova:vcpus>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      </nova:flavor>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <nova:owner>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:        <nova:user uuid="9fb5352c62684f2ba3a326a953a10dfe">tempest-TestNetworkBasicOps-1975357669-project-member</nova:user>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:        <nova:project uuid="782593db60784ab8bff41fe87d72ff5f">tempest-TestNetworkBasicOps-1975357669</nova:project>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      </nova:owner>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <nova:root type="image" uuid="3c45fa6c-8a99-4359-a34e-d89f4e1e77d0"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <nova:ports>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:        <nova:port uuid="3b1f1868-aa98-47ef-be0e-37dd0abbfa44">
Nov 23 16:10:11 np0005532763 nova_compute[231311]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:        </nova:port>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      </nova:ports>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    </nova:instance>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  </metadata>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  <sysinfo type="smbios">
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <system>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <entry name="manufacturer">RDO</entry>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <entry name="product">OpenStack Compute</entry>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <entry name="serial">28677820-c1a2-4bbc-91d4-f2d7448eee33</entry>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <entry name="uuid">28677820-c1a2-4bbc-91d4-f2d7448eee33</entry>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <entry name="family">Virtual Machine</entry>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    </system>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  </sysinfo>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  <os>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <boot dev="hd"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <smbios mode="sysinfo"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  </os>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  <features>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <acpi/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <apic/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <vmcoreinfo/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  </features>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  <clock offset="utc">
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <timer name="pit" tickpolicy="delay"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <timer name="hpet" present="no"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  </clock>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  <cpu mode="host-model" match="exact">
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <topology sockets="1" cores="1" threads="1"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  </cpu>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  <devices>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <disk type="network" device="disk">
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <driver type="raw" cache="none"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <source protocol="rbd" name="vms/28677820-c1a2-4bbc-91d4-f2d7448eee33_disk">
Nov 23 16:10:11 np0005532763 nova_compute[231311]:        <host name="192.168.122.100" port="6789"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:        <host name="192.168.122.102" port="6789"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:        <host name="192.168.122.101" port="6789"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      </source>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <auth username="openstack">
Nov 23 16:10:11 np0005532763 nova_compute[231311]:        <secret type="ceph" uuid="03808be8-ae4a-5548-82e6-4a294f1bc627"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      </auth>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <target dev="vda" bus="virtio"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    </disk>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <disk type="network" device="cdrom">
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <driver type="raw" cache="none"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <source protocol="rbd" name="vms/28677820-c1a2-4bbc-91d4-f2d7448eee33_disk.config">
Nov 23 16:10:11 np0005532763 nova_compute[231311]:        <host name="192.168.122.100" port="6789"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:        <host name="192.168.122.102" port="6789"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:        <host name="192.168.122.101" port="6789"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      </source>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <auth username="openstack">
Nov 23 16:10:11 np0005532763 nova_compute[231311]:        <secret type="ceph" uuid="03808be8-ae4a-5548-82e6-4a294f1bc627"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      </auth>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <target dev="sda" bus="sata"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    </disk>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <interface type="ethernet">
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <mac address="fa:16:3e:37:fc:98"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <model type="virtio"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <driver name="vhost" rx_queue_size="512"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <mtu size="1442"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <target dev="tap3b1f1868-aa"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    </interface>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <serial type="pty">
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <log file="/var/lib/nova/instances/28677820-c1a2-4bbc-91d4-f2d7448eee33/console.log" append="off"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    </serial>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <video>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <model type="virtio"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    </video>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <input type="tablet" bus="usb"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <rng model="virtio">
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <backend model="random">/dev/urandom</backend>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    </rng>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <controller type="usb" index="0"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    <memballoon model="virtio">
Nov 23 16:10:11 np0005532763 nova_compute[231311]:      <stats period="10"/>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:    </memballoon>
Nov 23 16:10:11 np0005532763 nova_compute[231311]:  </devices>
Nov 23 16:10:11 np0005532763 nova_compute[231311]: </domain>
Nov 23 16:10:11 np0005532763 nova_compute[231311]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.421 231315 DEBUG nova.compute.manager [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Preparing to wait for external event network-vif-plugged-3b1f1868-aa98-47ef-be0e-37dd0abbfa44 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.422 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.422 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.422 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.423 231315 DEBUG nova.virt.libvirt.vif [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T21:10:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1871965757',display_name='tempest-TestNetworkBasicOps-server-1871965757',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1871965757',id=5,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNOAT9k3zGchBzWDN79+VgyhGUt8IeduAM1JHy7qxGVq4N9VQlTdW7nc+YKCPOnHiFUD9FwEiDOYQcEe6RaTaPalAn8aDudsWDDereIsBSXehGmgLe2qqOH5/26yedYazw==',key_name='tempest-TestNetworkBasicOps-1070907508',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-zim0hrvb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-23T21:10:04Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=28677820-c1a2-4bbc-91d4-f2d7448eee33,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "address": "fa:16:3e:37:fc:98", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1f1868-aa", "ovs_interfaceid": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.423 231315 DEBUG nova.network.os_vif_util [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "address": "fa:16:3e:37:fc:98", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1f1868-aa", "ovs_interfaceid": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.424 231315 DEBUG nova.network.os_vif_util [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:fc:98,bridge_name='br-int',has_traffic_filtering=True,id=3b1f1868-aa98-47ef-be0e-37dd0abbfa44,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b1f1868-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.424 231315 DEBUG os_vif [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:fc:98,bridge_name='br-int',has_traffic_filtering=True,id=3b1f1868-aa98-47ef-be0e-37dd0abbfa44,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b1f1868-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.424 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.425 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.425 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.429 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.430 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3b1f1868-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.431 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3b1f1868-aa, col_values=(('external_ids', {'iface-id': '3b1f1868-aa98-47ef-be0e-37dd0abbfa44', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:37:fc:98', 'vm-uuid': '28677820-c1a2-4bbc-91d4-f2d7448eee33'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.434 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:11 np0005532763 NetworkManager[48849]: <info>  [1763932211.4377] manager: (tap3b1f1868-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.438 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.446 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.447 231315 INFO os_vif [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:fc:98,bridge_name='br-int',has_traffic_filtering=True,id=3b1f1868-aa98-47ef-be0e-37dd0abbfa44,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b1f1868-aa')#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.499 231315 DEBUG nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.499 231315 DEBUG nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.500 231315 DEBUG nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No VIF found with MAC fa:16:3e:37:fc:98, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.501 231315 INFO nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Using config drive#033[00m
Nov 23 16:10:11 np0005532763 nova_compute[231311]: 2025-11-23 21:10:11.544 231315 DEBUG nova.storage.rbd_utils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 28677820-c1a2-4bbc-91d4-f2d7448eee33_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:10:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:12 np0005532763 nova_compute[231311]: 2025-11-23 21:10:12.151 231315 INFO nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Creating config drive at /var/lib/nova/instances/28677820-c1a2-4bbc-91d4-f2d7448eee33/disk.config#033[00m
Nov 23 16:10:12 np0005532763 nova_compute[231311]: 2025-11-23 21:10:12.158 231315 DEBUG oslo_concurrency.processutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/28677820-c1a2-4bbc-91d4-f2d7448eee33/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppcb3dxfs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:10:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:12.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:12 np0005532763 nova_compute[231311]: 2025-11-23 21:10:12.303 231315 DEBUG oslo_concurrency.processutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/28677820-c1a2-4bbc-91d4-f2d7448eee33/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppcb3dxfs" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:10:12 np0005532763 nova_compute[231311]: 2025-11-23 21:10:12.355 231315 DEBUG nova.storage.rbd_utils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 28677820-c1a2-4bbc-91d4-f2d7448eee33_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:10:12 np0005532763 nova_compute[231311]: 2025-11-23 21:10:12.361 231315 DEBUG oslo_concurrency.processutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/28677820-c1a2-4bbc-91d4-f2d7448eee33/disk.config 28677820-c1a2-4bbc-91d4-f2d7448eee33_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:10:12 np0005532763 nova_compute[231311]: 2025-11-23 21:10:12.430 231315 DEBUG nova.network.neutron [req-28622c9c-f9de-474e-aa03-337f56a349a8 req-d0b0756e-3184-4a25-a822-bd93af561a7c 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Updated VIF entry in instance network info cache for port 3b1f1868-aa98-47ef-be0e-37dd0abbfa44. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 23 16:10:12 np0005532763 nova_compute[231311]: 2025-11-23 21:10:12.431 231315 DEBUG nova.network.neutron [req-28622c9c-f9de-474e-aa03-337f56a349a8 req-d0b0756e-3184-4a25-a822-bd93af561a7c 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Updating instance_info_cache with network_info: [{"id": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "address": "fa:16:3e:37:fc:98", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1f1868-aa", "ovs_interfaceid": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:10:12 np0005532763 nova_compute[231311]: 2025-11-23 21:10:12.448 231315 DEBUG oslo_concurrency.lockutils [req-28622c9c-f9de-474e-aa03-337f56a349a8 req-d0b0756e-3184-4a25-a822-bd93af561a7c 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Releasing lock "refresh_cache-28677820-c1a2-4bbc-91d4-f2d7448eee33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:10:12 np0005532763 nova_compute[231311]: 2025-11-23 21:10:12.575 231315 DEBUG oslo_concurrency.processutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/28677820-c1a2-4bbc-91d4-f2d7448eee33/disk.config 28677820-c1a2-4bbc-91d4-f2d7448eee33_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.214s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:10:12 np0005532763 nova_compute[231311]: 2025-11-23 21:10:12.576 231315 INFO nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Deleting local config drive /var/lib/nova/instances/28677820-c1a2-4bbc-91d4-f2d7448eee33/disk.config because it was imported into RBD.#033[00m
Nov 23 16:10:12 np0005532763 systemd[1]: Starting libvirt secret daemon...
Nov 23 16:10:12 np0005532763 systemd[1]: Started libvirt secret daemon.
Nov 23 16:10:12 np0005532763 kernel: tap3b1f1868-aa: entered promiscuous mode
Nov 23 16:10:12 np0005532763 NetworkManager[48849]: <info>  [1763932212.7286] manager: (tap3b1f1868-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Nov 23 16:10:12 np0005532763 ovn_controller[133425]: 2025-11-23T21:10:12Z|00037|binding|INFO|Claiming lport 3b1f1868-aa98-47ef-be0e-37dd0abbfa44 for this chassis.
Nov 23 16:10:12 np0005532763 ovn_controller[133425]: 2025-11-23T21:10:12Z|00038|binding|INFO|3b1f1868-aa98-47ef-be0e-37dd0abbfa44: Claiming fa:16:3e:37:fc:98 10.100.0.12
Nov 23 16:10:12 np0005532763 nova_compute[231311]: 2025-11-23 21:10:12.730 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:12 np0005532763 nova_compute[231311]: 2025-11-23 21:10:12.739 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:12 np0005532763 nova_compute[231311]: 2025-11-23 21:10:12.745 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:12 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:12.757 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:fc:98 10.100.0.12'], port_security=['fa:16:3e:37:fc:98 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '28677820-c1a2-4bbc-91d4-f2d7448eee33', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ff6a2ba-50a1-444b-9685-151db9bcac89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782593db60784ab8bff41fe87d72ff5f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2a6c1b69-209a-4704-854e-f7cfc81d8441', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c0604ff-606a-413a-88a2-c316eba90e56, chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], logical_port=3b1f1868-aa98-47ef-be0e-37dd0abbfa44) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:10:12 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:12.759 142920 INFO neutron.agent.ovn.metadata.agent [-] Port 3b1f1868-aa98-47ef-be0e-37dd0abbfa44 in datapath 6ff6a2ba-50a1-444b-9685-151db9bcac89 bound to our chassis#033[00m
Nov 23 16:10:12 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:12.761 142920 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6ff6a2ba-50a1-444b-9685-151db9bcac89#033[00m
Nov 23 16:10:12 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:12.779 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[c577e9a1-75f5-4919-b4e7-e6fab5b12d4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:12 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:12.781 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6ff6a2ba-51 in ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 23 16:10:12 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:12.783 235389 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6ff6a2ba-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 23 16:10:12 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:12.783 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[62654736-40b4-4996-87ea-84127993c7f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:12 np0005532763 systemd-machined[194484]: New machine qemu-3-instance-00000005.
Nov 23 16:10:12 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:12.784 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[42f24615-cd34-402e-8963-becf6a7a9798]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:12 np0005532763 systemd-udevd[237579]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 16:10:12 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:12.805 143034 DEBUG oslo.privsep.daemon [-] privsep: reply[4394ffb4-291c-40b7-82a8-f811368ab134]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:12 np0005532763 NetworkManager[48849]: <info>  [1763932212.8166] device (tap3b1f1868-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 23 16:10:12 np0005532763 NetworkManager[48849]: <info>  [1763932212.8184] device (tap3b1f1868-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 23 16:10:12 np0005532763 systemd[1]: Started Virtual Machine qemu-3-instance-00000005.
Nov 23 16:10:12 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:12.833 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[73dba74d-76ad-4b2c-a0c6-26849995fd26]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:12 np0005532763 nova_compute[231311]: 2025-11-23 21:10:12.833 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:12 np0005532763 ovn_controller[133425]: 2025-11-23T21:10:12Z|00039|binding|INFO|Setting lport 3b1f1868-aa98-47ef-be0e-37dd0abbfa44 ovn-installed in OVS
Nov 23 16:10:12 np0005532763 ovn_controller[133425]: 2025-11-23T21:10:12Z|00040|binding|INFO|Setting lport 3b1f1868-aa98-47ef-be0e-37dd0abbfa44 up in Southbound
Nov 23 16:10:12 np0005532763 nova_compute[231311]: 2025-11-23 21:10:12.839 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:12 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:12.874 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[52c08f43-778e-4b3f-99a0-58189cde6706]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:12 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:12.884 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[7c0dd0f4-c789-40c5-a8ab-b9e8c59b3c16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:12 np0005532763 NetworkManager[48849]: <info>  [1763932212.8868] manager: (tap6ff6a2ba-50): new Veth device (/org/freedesktop/NetworkManager/Devices/34)
Nov 23 16:10:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:12 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:12.932 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[43a8c6d9-119d-4fbe-ae8e-b475b95359a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:12 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:12.937 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[942ea94c-83e4-43ba-9b96-fd45383cd29f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:12.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:12 np0005532763 NetworkManager[48849]: <info>  [1763932212.9716] device (tap6ff6a2ba-50): carrier: link connected
Nov 23 16:10:12 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:12.979 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[511511f8-d226-47f6-adbc-d4d9238593ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:13.006 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[e1cb730c-e0a8-4f20-9705-aee48beec2ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6ff6a2ba-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:e0:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416251, 'reachable_time': 40204, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237611, 'error': None, 'target': 'ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:13.030 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[949d2513-d545-456d-9bc9-5da334b7bc05]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:e098'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 416251, 'tstamp': 416251}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237613, 'error': None, 'target': 'ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.046 231315 DEBUG nova.compute.manager [req-ccee6455-534c-4031-9fb8-8ddc7107974a req-0a39e9cd-5a0d-4d3e-982a-4e37ba327f53 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Received event network-vif-plugged-3b1f1868-aa98-47ef-be0e-37dd0abbfa44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.046 231315 DEBUG oslo_concurrency.lockutils [req-ccee6455-534c-4031-9fb8-8ddc7107974a req-0a39e9cd-5a0d-4d3e-982a-4e37ba327f53 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.047 231315 DEBUG oslo_concurrency.lockutils [req-ccee6455-534c-4031-9fb8-8ddc7107974a req-0a39e9cd-5a0d-4d3e-982a-4e37ba327f53 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.047 231315 DEBUG oslo_concurrency.lockutils [req-ccee6455-534c-4031-9fb8-8ddc7107974a req-0a39e9cd-5a0d-4d3e-982a-4e37ba327f53 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.048 231315 DEBUG nova.compute.manager [req-ccee6455-534c-4031-9fb8-8ddc7107974a req-0a39e9cd-5a0d-4d3e-982a-4e37ba327f53 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Processing event network-vif-plugged-3b1f1868-aa98-47ef-be0e-37dd0abbfa44 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:13.057 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[d9d13aba-7078-4e63-be4a-9e30ebf0b9b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6ff6a2ba-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:e0:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416251, 'reachable_time': 40204, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237614, 'error': None, 'target': 'ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:13.103 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[1d5603e8-7316-4859-bd7f-f65eddf97a07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:13.188 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[94561926-8f3b-4c91-ba50-0fcf1db3061e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:13.190 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ff6a2ba-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:13.190 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:13.191 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6ff6a2ba-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:10:13 np0005532763 NetworkManager[48849]: <info>  [1763932213.1949] manager: (tap6ff6a2ba-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Nov 23 16:10:13 np0005532763 kernel: tap6ff6a2ba-50: entered promiscuous mode
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:13.198 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6ff6a2ba-50, col_values=(('external_ids', {'iface-id': '4bff4598-93d2-442e-90fe-19336d84eb93'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:10:13 np0005532763 ovn_controller[133425]: 2025-11-23T21:10:13Z|00041|binding|INFO|Releasing lport 4bff4598-93d2-442e-90fe-19336d84eb93 from this chassis (sb_readonly=0)
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.212 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.231 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.233 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:13.234 142920 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6ff6a2ba-50a1-444b-9685-151db9bcac89.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6ff6a2ba-50a1-444b-9685-151db9bcac89.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:13.235 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[9a69a8a6-cea6-42ea-82d3-698594d701c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:13.236 142920 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: global
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    log         /dev/log local0 debug
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    log-tag     haproxy-metadata-proxy-6ff6a2ba-50a1-444b-9685-151db9bcac89
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    user        root
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    group       root
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    maxconn     1024
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    pidfile     /var/lib/neutron/external/pids/6ff6a2ba-50a1-444b-9685-151db9bcac89.pid.haproxy
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    daemon
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: 
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: defaults
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    log global
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    mode http
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    option httplog
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    option dontlognull
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    option http-server-close
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    option forwardfor
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    retries                 3
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    timeout http-request    30s
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    timeout connect         30s
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    timeout client          32s
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    timeout server          32s
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    timeout http-keep-alive 30s
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: 
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: 
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: listen listener
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    bind 169.254.169.254:80
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    server metadata /var/lib/neutron/metadata_proxy
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]:    http-request add-header X-OVN-Network-ID 6ff6a2ba-50a1-444b-9685-151db9bcac89
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 23 16:10:13 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:13.237 142920 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89', 'env', 'PROCESS_TAG=haproxy-6ff6a2ba-50a1-444b-9685-151db9bcac89', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6ff6a2ba-50a1-444b-9685-151db9bcac89.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.667 231315 DEBUG nova.virt.driver [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Emitting event <LifecycleEvent: 1763932213.6660368, 28677820-c1a2-4bbc-91d4-f2d7448eee33 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.668 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] VM Started (Lifecycle Event)#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.671 231315 DEBUG nova.compute.manager [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.675 231315 DEBUG nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.680 231315 INFO nova.virt.libvirt.driver [-] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Instance spawned successfully.#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.681 231315 DEBUG nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.685 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:10:13 np0005532763 podman[237687]: 2025-11-23 21:10:13.68705877 +0000 UTC m=+0.070160074 container create 900801f03e52d761ace442cbe045a536db8fde27c3d64243dd76671793d51b69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.693 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.708 231315 DEBUG nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.709 231315 DEBUG nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.710 231315 DEBUG nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.710 231315 DEBUG nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.711 231315 DEBUG nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.712 231315 DEBUG nova.virt.libvirt.driver [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.718 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.719 231315 DEBUG nova.virt.driver [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Emitting event <LifecycleEvent: 1763932213.6661994, 28677820-c1a2-4bbc-91d4-f2d7448eee33 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.720 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] VM Paused (Lifecycle Event)#033[00m
Nov 23 16:10:13 np0005532763 systemd[1]: Started libpod-conmon-900801f03e52d761ace442cbe045a536db8fde27c3d64243dd76671793d51b69.scope.
Nov 23 16:10:13 np0005532763 podman[237687]: 2025-11-23 21:10:13.647349275 +0000 UTC m=+0.030450589 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 23 16:10:13 np0005532763 systemd[1]: Started libcrun container.
Nov 23 16:10:13 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a0c05bfe8f205e497eafd825a209b2579d197f5af97e2c414183f269a8089273/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 16:10:13 np0005532763 podman[237687]: 2025-11-23 21:10:13.778359641 +0000 UTC m=+0.161460935 container init 900801f03e52d761ace442cbe045a536db8fde27c3d64243dd76671793d51b69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:10:13 np0005532763 podman[237687]: 2025-11-23 21:10:13.784213494 +0000 UTC m=+0.167314778 container start 900801f03e52d761ace442cbe045a536db8fde27c3d64243dd76671793d51b69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.801 231315 INFO nova.compute.manager [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Took 9.49 seconds to spawn the instance on the hypervisor.#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.802 231315 DEBUG nova.compute.manager [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.809 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:10:13 np0005532763 neutron-haproxy-ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89[237704]: [NOTICE]   (237708) : New worker (237710) forked
Nov 23 16:10:13 np0005532763 neutron-haproxy-ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89[237704]: [NOTICE]   (237708) : Loading success.
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.821 231315 DEBUG nova.virt.driver [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Emitting event <LifecycleEvent: 1763932213.6744833, 28677820-c1a2-4bbc-91d4-f2d7448eee33 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.821 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] VM Resumed (Lifecycle Event)#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.848 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.854 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.881 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.895 231315 INFO nova.compute.manager [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Took 10.45 seconds to build instance.#033[00m
Nov 23 16:10:13 np0005532763 nova_compute[231311]: 2025-11-23 21:10:13.909 231315 DEBUG oslo_concurrency.lockutils [None req-d5f153c1-56f4-4ebb-9278-c931499773b9 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "28677820-c1a2-4bbc-91d4-f2d7448eee33" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.528s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:10:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:13 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:10:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:13 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:10:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:13 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:10:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:14 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:10:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:14.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:10:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:10:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:14.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:10:15 np0005532763 nova_compute[231311]: 2025-11-23 21:10:15.134 231315 DEBUG nova.compute.manager [req-0b590e16-7e43-4bfc-9cb2-9c1c0e9912f1 req-2d7cdfc9-05b9-454c-811f-8031c1609e10 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Received event network-vif-plugged-3b1f1868-aa98-47ef-be0e-37dd0abbfa44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:10:15 np0005532763 nova_compute[231311]: 2025-11-23 21:10:15.135 231315 DEBUG oslo_concurrency.lockutils [req-0b590e16-7e43-4bfc-9cb2-9c1c0e9912f1 req-2d7cdfc9-05b9-454c-811f-8031c1609e10 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:10:15 np0005532763 nova_compute[231311]: 2025-11-23 21:10:15.135 231315 DEBUG oslo_concurrency.lockutils [req-0b590e16-7e43-4bfc-9cb2-9c1c0e9912f1 req-2d7cdfc9-05b9-454c-811f-8031c1609e10 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:10:15 np0005532763 nova_compute[231311]: 2025-11-23 21:10:15.136 231315 DEBUG oslo_concurrency.lockutils [req-0b590e16-7e43-4bfc-9cb2-9c1c0e9912f1 req-2d7cdfc9-05b9-454c-811f-8031c1609e10 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:10:15 np0005532763 nova_compute[231311]: 2025-11-23 21:10:15.136 231315 DEBUG nova.compute.manager [req-0b590e16-7e43-4bfc-9cb2-9c1c0e9912f1 req-2d7cdfc9-05b9-454c-811f-8031c1609e10 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] No waiting events found dispatching network-vif-plugged-3b1f1868-aa98-47ef-be0e-37dd0abbfa44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:10:15 np0005532763 nova_compute[231311]: 2025-11-23 21:10:15.136 231315 WARNING nova.compute.manager [req-0b590e16-7e43-4bfc-9cb2-9c1c0e9912f1 req-2d7cdfc9-05b9-454c-811f-8031c1609e10 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Received unexpected event network-vif-plugged-3b1f1868-aa98-47ef-be0e-37dd0abbfa44 for instance with vm_state active and task_state None.#033[00m
Nov 23 16:10:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:15 np0005532763 podman[237720]: 2025-11-23 21:10:15.223809182 +0000 UTC m=+0.094751558 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 23 16:10:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:16.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:16 np0005532763 nova_compute[231311]: 2025-11-23 21:10:16.298 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:16 np0005532763 nova_compute[231311]: 2025-11-23 21:10:16.434 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:16.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:18 np0005532763 nova_compute[231311]: 2025-11-23 21:10:18.126 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:18 np0005532763 ovn_controller[133425]: 2025-11-23T21:10:18Z|00042|binding|INFO|Releasing lport 4bff4598-93d2-442e-90fe-19336d84eb93 from this chassis (sb_readonly=0)
Nov 23 16:10:18 np0005532763 NetworkManager[48849]: <info>  [1763932218.1290] manager: (patch-br-int-to-provnet-ce139dcc-0def-41ea-bc8f-4f8d9359e223): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Nov 23 16:10:18 np0005532763 NetworkManager[48849]: <info>  [1763932218.1303] manager: (patch-provnet-ce139dcc-0def-41ea-bc8f-4f8d9359e223-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Nov 23 16:10:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:18.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:18 np0005532763 ovn_controller[133425]: 2025-11-23T21:10:18Z|00043|binding|INFO|Releasing lport 4bff4598-93d2-442e-90fe-19336d84eb93 from this chassis (sb_readonly=0)
Nov 23 16:10:18 np0005532763 nova_compute[231311]: 2025-11-23 21:10:18.183 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:18 np0005532763 nova_compute[231311]: 2025-11-23 21:10:18.189 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:18 np0005532763 nova_compute[231311]: 2025-11-23 21:10:18.371 231315 DEBUG nova.compute.manager [req-80f222a0-2c1c-4d27-9b09-e2c7382d8952 req-e5ab41e1-ef4c-4083-97b6-d3c6fdc8e25b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Received event network-changed-3b1f1868-aa98-47ef-be0e-37dd0abbfa44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 23 16:10:18 np0005532763 nova_compute[231311]: 2025-11-23 21:10:18.372 231315 DEBUG nova.compute.manager [req-80f222a0-2c1c-4d27-9b09-e2c7382d8952 req-e5ab41e1-ef4c-4083-97b6-d3c6fdc8e25b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Refreshing instance network info cache due to event network-changed-3b1f1868-aa98-47ef-be0e-37dd0abbfa44. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 23 16:10:18 np0005532763 nova_compute[231311]: 2025-11-23 21:10:18.372 231315 DEBUG oslo_concurrency.lockutils [req-80f222a0-2c1c-4d27-9b09-e2c7382d8952 req-e5ab41e1-ef4c-4083-97b6-d3c6fdc8e25b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "refresh_cache-28677820-c1a2-4bbc-91d4-f2d7448eee33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 16:10:18 np0005532763 nova_compute[231311]: 2025-11-23 21:10:18.373 231315 DEBUG oslo_concurrency.lockutils [req-80f222a0-2c1c-4d27-9b09-e2c7382d8952 req-e5ab41e1-ef4c-4083-97b6-d3c6fdc8e25b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquired lock "refresh_cache-28677820-c1a2-4bbc-91d4-f2d7448eee33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 16:10:18 np0005532763 nova_compute[231311]: 2025-11-23 21:10:18.374 231315 DEBUG nova.network.neutron [req-80f222a0-2c1c-4d27-9b09-e2c7382d8952 req-e5ab41e1-ef4c-4083-97b6-d3c6fdc8e25b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Refreshing network info cache for port 3b1f1868-aa98-47ef-be0e-37dd0abbfa44 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 23 16:10:18 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:18.394 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3a:26:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:d5:4d:db:d5:2b'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 16:10:18 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:18.397 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 16:10:18 np0005532763 nova_compute[231311]: 2025-11-23 21:10:18.397 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:18.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:18 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:10:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:18 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:10:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:18 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:10:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:19 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:10:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:19 np0005532763 nova_compute[231311]: 2025-11-23 21:10:19.352 231315 DEBUG nova.network.neutron [req-80f222a0-2c1c-4d27-9b09-e2c7382d8952 req-e5ab41e1-ef4c-4083-97b6-d3c6fdc8e25b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Updated VIF entry in instance network info cache for port 3b1f1868-aa98-47ef-be0e-37dd0abbfa44. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 23 16:10:19 np0005532763 nova_compute[231311]: 2025-11-23 21:10:19.353 231315 DEBUG nova.network.neutron [req-80f222a0-2c1c-4d27-9b09-e2c7382d8952 req-e5ab41e1-ef4c-4083-97b6-d3c6fdc8e25b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Updating instance_info_cache with network_info: [{"id": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "address": "fa:16:3e:37:fc:98", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1f1868-aa", "ovs_interfaceid": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 16:10:19 np0005532763 nova_compute[231311]: 2025-11-23 21:10:19.375 231315 DEBUG oslo_concurrency.lockutils [req-80f222a0-2c1c-4d27-9b09-e2c7382d8952 req-e5ab41e1-ef4c-4083-97b6-d3c6fdc8e25b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Releasing lock "refresh_cache-28677820-c1a2-4bbc-91d4-f2d7448eee33" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 16:10:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:10:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:20.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:20.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:21 np0005532763 nova_compute[231311]: 2025-11-23 21:10:21.302 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:21 np0005532763 nova_compute[231311]: 2025-11-23 21:10:21.436 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:22.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:22.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:23 np0005532763 podman[237751]: 2025-11-23 21:10:23.270731721 +0000 UTC m=+0.134565717 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 16:10:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:23 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:10:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:23 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:10:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:23 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:10:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:24 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:10:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:24.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:10:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:24.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:26.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:26 np0005532763 podman[237804]: 2025-11-23 21:10:26.239676526 +0000 UTC m=+0.111156555 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 16:10:26 np0005532763 nova_compute[231311]: 2025-11-23 21:10:26.350 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:26 np0005532763 nova_compute[231311]: 2025-11-23 21:10:26.438 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:26 np0005532763 ovn_controller[133425]: 2025-11-23T21:10:26Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:37:fc:98 10.100.0.12
Nov 23 16:10:26 np0005532763 ovn_controller[133425]: 2025-11-23T21:10:26Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:37:fc:98 10.100.0.12
Nov 23 16:10:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:26.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:28.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:28 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:28.400 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10e3bf57-dd2d-4b94-851f-925bcd297dde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 16:10:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:28.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:28 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:10:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:28 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:10:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:28 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:10:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:29 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:10:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:10:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000055s ======
Nov 23 16:10:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:30.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000055s
Nov 23 16:10:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:30.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:31 np0005532763 nova_compute[231311]: 2025-11-23 21:10:31.353 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:31 np0005532763 nova_compute[231311]: 2025-11-23 21:10:31.441 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:32.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:32 np0005532763 nova_compute[231311]: 2025-11-23 21:10:32.893 231315 INFO nova.compute.manager [None req-eb5fdd4e-e6a3-447b-9274-2547a00cbb3a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Get console output
Nov 23 16:10:32 np0005532763 nova_compute[231311]: 2025-11-23 21:10:32.900 231315 INFO oslo.privsep.daemon [None req-eb5fdd4e-e6a3-447b-9274-2547a00cbb3a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp65j_zed8/privsep.sock']
Nov 23 16:10:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:32.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:33 np0005532763 nova_compute[231311]: 2025-11-23 21:10:33.644 231315 INFO oslo.privsep.daemon [None req-eb5fdd4e-e6a3-447b-9274-2547a00cbb3a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Spawned new privsep daemon via rootwrap
Nov 23 16:10:33 np0005532763 nova_compute[231311]: 2025-11-23 21:10:33.521 237838 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 23 16:10:33 np0005532763 nova_compute[231311]: 2025-11-23 21:10:33.527 237838 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 23 16:10:33 np0005532763 nova_compute[231311]: 2025-11-23 21:10:33.531 237838 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 23 16:10:33 np0005532763 nova_compute[231311]: 2025-11-23 21:10:33.532 237838 INFO oslo.privsep.daemon [-] privsep daemon running as pid 237838
Nov 23 16:10:33 np0005532763 nova_compute[231311]: 2025-11-23 21:10:33.749 237838 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 23 16:10:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:33 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:10:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:33 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:10:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:33 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:10:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:34 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:10:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.161 231315 DEBUG oslo_concurrency.lockutils [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "28677820-c1a2-4bbc-91d4-f2d7448eee33" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.162 231315 DEBUG oslo_concurrency.lockutils [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "28677820-c1a2-4bbc-91d4-f2d7448eee33" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.162 231315 DEBUG oslo_concurrency.lockutils [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.163 231315 DEBUG oslo_concurrency.lockutils [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.163 231315 DEBUG oslo_concurrency.lockutils [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.165 231315 INFO nova.compute.manager [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Terminating instance
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.167 231315 DEBUG nova.compute.manager [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 23 16:10:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:34.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:34 np0005532763 kernel: tap3b1f1868-aa (unregistering): left promiscuous mode
Nov 23 16:10:34 np0005532763 NetworkManager[48849]: <info>  [1763932234.2352] device (tap3b1f1868-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 23 16:10:34 np0005532763 ovn_controller[133425]: 2025-11-23T21:10:34Z|00044|binding|INFO|Releasing lport 3b1f1868-aa98-47ef-be0e-37dd0abbfa44 from this chassis (sb_readonly=0)
Nov 23 16:10:34 np0005532763 ovn_controller[133425]: 2025-11-23T21:10:34Z|00045|binding|INFO|Setting lport 3b1f1868-aa98-47ef-be0e-37dd0abbfa44 down in Southbound
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.250 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:34 np0005532763 ovn_controller[133425]: 2025-11-23T21:10:34Z|00046|binding|INFO|Removing iface tap3b1f1868-aa ovn-installed in OVS
Nov 23 16:10:34 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:34.258 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:fc:98 10.100.0.12'], port_security=['fa:16:3e:37:fc:98 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '28677820-c1a2-4bbc-91d4-f2d7448eee33', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6ff6a2ba-50a1-444b-9685-151db9bcac89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782593db60784ab8bff41fe87d72ff5f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2a6c1b69-209a-4704-854e-f7cfc81d8441', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.196'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c0604ff-606a-413a-88a2-c316eba90e56, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], logical_port=3b1f1868-aa98-47ef-be0e-37dd0abbfa44) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:10:34 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:34.260 142920 INFO neutron.agent.ovn.metadata.agent [-] Port 3b1f1868-aa98-47ef-be0e-37dd0abbfa44 in datapath 6ff6a2ba-50a1-444b-9685-151db9bcac89 unbound from our chassis#033[00m
Nov 23 16:10:34 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:34.262 142920 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6ff6a2ba-50a1-444b-9685-151db9bcac89, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 23 16:10:34 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:34.265 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[0467e441-05e5-4c80-aa08-a2703f3bcac5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:34 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:34.265 142920 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89 namespace which is not needed anymore#033[00m
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.286 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:34 np0005532763 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Deactivated successfully.
Nov 23 16:10:34 np0005532763 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000005.scope: Consumed 13.868s CPU time.
Nov 23 16:10:34 np0005532763 systemd-machined[194484]: Machine qemu-3-instance-00000005 terminated.
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.407 231315 INFO nova.virt.libvirt.driver [-] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Instance destroyed successfully.#033[00m
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.407 231315 DEBUG nova.objects.instance [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lazy-loading 'resources' on Instance uuid 28677820-c1a2-4bbc-91d4-f2d7448eee33 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.422 231315 DEBUG nova.virt.libvirt.vif [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-23T21:10:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1871965757',display_name='tempest-TestNetworkBasicOps-server-1871965757',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1871965757',id=5,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNOAT9k3zGchBzWDN79+VgyhGUt8IeduAM1JHy7qxGVq4N9VQlTdW7nc+YKCPOnHiFUD9FwEiDOYQcEe6RaTaPalAn8aDudsWDDereIsBSXehGmgLe2qqOH5/26yedYazw==',key_name='tempest-TestNetworkBasicOps-1070907508',keypairs=<?>,launch_index=0,launched_at=2025-11-23T21:10:13Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-zim0hrvb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-23T21:10:13Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=28677820-c1a2-4bbc-91d4-f2d7448eee33,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "address": "fa:16:3e:37:fc:98", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1f1868-aa", "ovs_interfaceid": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.423 231315 DEBUG nova.network.os_vif_util [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "address": "fa:16:3e:37:fc:98", "network": {"id": "6ff6a2ba-50a1-444b-9685-151db9bcac89", "bridge": "br-int", "label": "tempest-network-smoke--285822202", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3b1f1868-aa", "ovs_interfaceid": "3b1f1868-aa98-47ef-be0e-37dd0abbfa44", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.424 231315 DEBUG nova.network.os_vif_util [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:37:fc:98,bridge_name='br-int',has_traffic_filtering=True,id=3b1f1868-aa98-47ef-be0e-37dd0abbfa44,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b1f1868-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.425 231315 DEBUG os_vif [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:fc:98,bridge_name='br-int',has_traffic_filtering=True,id=3b1f1868-aa98-47ef-be0e-37dd0abbfa44,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b1f1868-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 23 16:10:34 np0005532763 neutron-haproxy-ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89[237704]: [NOTICE]   (237708) : haproxy version is 2.8.14-c23fe91
Nov 23 16:10:34 np0005532763 neutron-haproxy-ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89[237704]: [NOTICE]   (237708) : path to executable is /usr/sbin/haproxy
Nov 23 16:10:34 np0005532763 neutron-haproxy-ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89[237704]: [WARNING]  (237708) : Exiting Master process...
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.428 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.428 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3b1f1868-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:10:34 np0005532763 neutron-haproxy-ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89[237704]: [ALERT]    (237708) : Current worker (237710) exited with code 143 (Terminated)
Nov 23 16:10:34 np0005532763 neutron-haproxy-ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89[237704]: [WARNING]  (237708) : All workers exited. Exiting... (0)
Nov 23 16:10:34 np0005532763 systemd[1]: libpod-900801f03e52d761ace442cbe045a536db8fde27c3d64243dd76671793d51b69.scope: Deactivated successfully.
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.433 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.435 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:34 np0005532763 podman[237865]: 2025-11-23 21:10:34.439495435 +0000 UTC m=+0.066254856 container died 900801f03e52d761ace442cbe045a536db8fde27c3d64243dd76671793d51b69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.439 231315 INFO os_vif [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:fc:98,bridge_name='br-int',has_traffic_filtering=True,id=3b1f1868-aa98-47ef-be0e-37dd0abbfa44,network=Network(6ff6a2ba-50a1-444b-9685-151db9bcac89),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3b1f1868-aa')#033[00m
Nov 23 16:10:34 np0005532763 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-900801f03e52d761ace442cbe045a536db8fde27c3d64243dd76671793d51b69-userdata-shm.mount: Deactivated successfully.
Nov 23 16:10:34 np0005532763 systemd[1]: var-lib-containers-storage-overlay-a0c05bfe8f205e497eafd825a209b2579d197f5af97e2c414183f269a8089273-merged.mount: Deactivated successfully.
Nov 23 16:10:34 np0005532763 podman[237865]: 2025-11-23 21:10:34.497151341 +0000 UTC m=+0.123910802 container cleanup 900801f03e52d761ace442cbe045a536db8fde27c3d64243dd76671793d51b69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:10:34 np0005532763 systemd[1]: libpod-conmon-900801f03e52d761ace442cbe045a536db8fde27c3d64243dd76671793d51b69.scope: Deactivated successfully.
Nov 23 16:10:34 np0005532763 podman[237926]: 2025-11-23 21:10:34.60844273 +0000 UTC m=+0.071670817 container remove 900801f03e52d761ace442cbe045a536db8fde27c3d64243dd76671793d51b69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 16:10:34 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:34.616 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[5c1c74e9-78aa-418e-b30b-4ff8328304d1]: (4, ('Sun Nov 23 09:10:34 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89 (900801f03e52d761ace442cbe045a536db8fde27c3d64243dd76671793d51b69)\n900801f03e52d761ace442cbe045a536db8fde27c3d64243dd76671793d51b69\nSun Nov 23 09:10:34 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89 (900801f03e52d761ace442cbe045a536db8fde27c3d64243dd76671793d51b69)\n900801f03e52d761ace442cbe045a536db8fde27c3d64243dd76671793d51b69\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:34 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:34.619 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[d649189d-3421-430c-af3a-7f0781715e2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:34 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:34.621 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ff6a2ba-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.624 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:34 np0005532763 kernel: tap6ff6a2ba-50: left promiscuous mode
Nov 23 16:10:34 np0005532763 nova_compute[231311]: 2025-11-23 21:10:34.650 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:34 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:34.653 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[8a129acc-b407-44da-b8af-41db496bb526]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:34 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:34.670 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[8f0ebbfa-5719-40c7-bf85-9ead3b0ee34c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:34 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:34.672 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[61199344-f167-4b3c-848a-d3d82b436f8b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:34 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:34.697 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[3958946c-b9f6-4eec-adcd-521c40eaa1bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416240, 'reachable_time': 34263, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237942, 'error': None, 'target': 'ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:34 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:34.700 143034 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6ff6a2ba-50a1-444b-9685-151db9bcac89 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 23 16:10:34 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:34.700 143034 DEBUG oslo.privsep.daemon [-] privsep: reply[f89f2942-37e6-4ef8-9e01-919896819d9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:10:34 np0005532763 systemd[1]: run-netns-ovnmeta\x2d6ff6a2ba\x2d50a1\x2d444b\x2d9685\x2d151db9bcac89.mount: Deactivated successfully.
Nov 23 16:10:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:10:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:34.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:35 np0005532763 nova_compute[231311]: 2025-11-23 21:10:35.177 231315 INFO nova.virt.libvirt.driver [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Deleting instance files /var/lib/nova/instances/28677820-c1a2-4bbc-91d4-f2d7448eee33_del#033[00m
Nov 23 16:10:35 np0005532763 nova_compute[231311]: 2025-11-23 21:10:35.178 231315 INFO nova.virt.libvirt.driver [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Deletion of /var/lib/nova/instances/28677820-c1a2-4bbc-91d4-f2d7448eee33_del complete#033[00m
Nov 23 16:10:35 np0005532763 nova_compute[231311]: 2025-11-23 21:10:35.240 231315 INFO nova.compute.manager [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Took 1.07 seconds to destroy the instance on the hypervisor.#033[00m
Nov 23 16:10:35 np0005532763 nova_compute[231311]: 2025-11-23 21:10:35.241 231315 DEBUG oslo.service.loopingcall [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 23 16:10:35 np0005532763 nova_compute[231311]: 2025-11-23 21:10:35.241 231315 DEBUG nova.compute.manager [-] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 23 16:10:35 np0005532763 nova_compute[231311]: 2025-11-23 21:10:35.242 231315 DEBUG nova.network.neutron [-] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 23 16:10:35 np0005532763 nova_compute[231311]: 2025-11-23 21:10:35.339 231315 DEBUG nova.compute.manager [req-6045764c-0f7d-4202-8e32-cd169416bd2a req-6e26b601-7466-413e-820f-9ef62989b5bf 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Received event network-vif-unplugged-3b1f1868-aa98-47ef-be0e-37dd0abbfa44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:10:35 np0005532763 nova_compute[231311]: 2025-11-23 21:10:35.340 231315 DEBUG oslo_concurrency.lockutils [req-6045764c-0f7d-4202-8e32-cd169416bd2a req-6e26b601-7466-413e-820f-9ef62989b5bf 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 16:10:35 np0005532763 nova_compute[231311]: 2025-11-23 21:10:35.340 231315 DEBUG oslo_concurrency.lockutils [req-6045764c-0f7d-4202-8e32-cd169416bd2a req-6e26b601-7466-413e-820f-9ef62989b5bf 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 16:10:35 np0005532763 nova_compute[231311]: 2025-11-23 21:10:35.340 231315 DEBUG oslo_concurrency.lockutils [req-6045764c-0f7d-4202-8e32-cd169416bd2a req-6e26b601-7466-413e-820f-9ef62989b5bf 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 16:10:35 np0005532763 nova_compute[231311]: 2025-11-23 21:10:35.341 231315 DEBUG nova.compute.manager [req-6045764c-0f7d-4202-8e32-cd169416bd2a req-6e26b601-7466-413e-820f-9ef62989b5bf 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] No waiting events found dispatching network-vif-unplugged-3b1f1868-aa98-47ef-be0e-37dd0abbfa44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 23 16:10:35 np0005532763 nova_compute[231311]: 2025-11-23 21:10:35.341 231315 DEBUG nova.compute.manager [req-6045764c-0f7d-4202-8e32-cd169416bd2a req-6e26b601-7466-413e-820f-9ef62989b5bf 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Received event network-vif-unplugged-3b1f1868-aa98-47ef-be0e-37dd0abbfa44 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 23 16:10:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:36.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:36 np0005532763 nova_compute[231311]: 2025-11-23 21:10:36.399 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:36.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:37 np0005532763 nova_compute[231311]: 2025-11-23 21:10:37.314 231315 DEBUG nova.network.neutron [-] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 16:10:37 np0005532763 nova_compute[231311]: 2025-11-23 21:10:37.334 231315 INFO nova.compute.manager [-] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Took 2.09 seconds to deallocate network for instance.
Nov 23 16:10:37 np0005532763 nova_compute[231311]: 2025-11-23 21:10:37.386 231315 DEBUG oslo_concurrency.lockutils [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 16:10:37 np0005532763 nova_compute[231311]: 2025-11-23 21:10:37.387 231315 DEBUG oslo_concurrency.lockutils [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 16:10:37 np0005532763 nova_compute[231311]: 2025-11-23 21:10:37.420 231315 DEBUG nova.compute.manager [req-fe6cba5b-5a1a-4e01-bd40-f70f31288223 req-cd220e39-27b6-4af6-aa3f-8c93bfb496d4 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Received event network-vif-plugged-3b1f1868-aa98-47ef-be0e-37dd0abbfa44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 23 16:10:37 np0005532763 nova_compute[231311]: 2025-11-23 21:10:37.421 231315 DEBUG oslo_concurrency.lockutils [req-fe6cba5b-5a1a-4e01-bd40-f70f31288223 req-cd220e39-27b6-4af6-aa3f-8c93bfb496d4 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 16:10:37 np0005532763 nova_compute[231311]: 2025-11-23 21:10:37.421 231315 DEBUG oslo_concurrency.lockutils [req-fe6cba5b-5a1a-4e01-bd40-f70f31288223 req-cd220e39-27b6-4af6-aa3f-8c93bfb496d4 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 16:10:37 np0005532763 nova_compute[231311]: 2025-11-23 21:10:37.422 231315 DEBUG oslo_concurrency.lockutils [req-fe6cba5b-5a1a-4e01-bd40-f70f31288223 req-cd220e39-27b6-4af6-aa3f-8c93bfb496d4 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "28677820-c1a2-4bbc-91d4-f2d7448eee33-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 16:10:37 np0005532763 nova_compute[231311]: 2025-11-23 21:10:37.422 231315 DEBUG nova.compute.manager [req-fe6cba5b-5a1a-4e01-bd40-f70f31288223 req-cd220e39-27b6-4af6-aa3f-8c93bfb496d4 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] No waiting events found dispatching network-vif-plugged-3b1f1868-aa98-47ef-be0e-37dd0abbfa44 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 23 16:10:37 np0005532763 nova_compute[231311]: 2025-11-23 21:10:37.423 231315 WARNING nova.compute.manager [req-fe6cba5b-5a1a-4e01-bd40-f70f31288223 req-cd220e39-27b6-4af6-aa3f-8c93bfb496d4 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Received unexpected event network-vif-plugged-3b1f1868-aa98-47ef-be0e-37dd0abbfa44 for instance with vm_state deleted and task_state None.
Nov 23 16:10:37 np0005532763 nova_compute[231311]: 2025-11-23 21:10:37.426 231315 DEBUG nova.compute.manager [req-837160be-7eaf-49ae-85ab-204cd9cd3a81 req-f2ba9526-4b2f-43f4-8348-008bf0ba68c5 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Received event network-vif-deleted-3b1f1868-aa98-47ef-be0e-37dd0abbfa44 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 23 16:10:37 np0005532763 nova_compute[231311]: 2025-11-23 21:10:37.523 231315 DEBUG oslo_concurrency.processutils [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 16:10:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:38 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:10:38 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/788483409' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:10:38 np0005532763 nova_compute[231311]: 2025-11-23 21:10:38.068 231315 DEBUG oslo_concurrency.processutils [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.545s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 16:10:38 np0005532763 nova_compute[231311]: 2025-11-23 21:10:38.078 231315 DEBUG nova.compute.provider_tree [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 16:10:38 np0005532763 nova_compute[231311]: 2025-11-23 21:10:38.095 231315 DEBUG nova.scheduler.client.report [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 16:10:38 np0005532763 nova_compute[231311]: 2025-11-23 21:10:38.122 231315 DEBUG oslo_concurrency.lockutils [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.736s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 16:10:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:38 np0005532763 nova_compute[231311]: 2025-11-23 21:10:38.151 231315 INFO nova.scheduler.client.report [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Deleted allocations for instance 28677820-c1a2-4bbc-91d4-f2d7448eee33
Nov 23 16:10:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:38.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:38 np0005532763 nova_compute[231311]: 2025-11-23 21:10:38.222 231315 DEBUG oslo_concurrency.lockutils [None req-f7e03c8e-ed7a-4312-a249-a22fd778611d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "28677820-c1a2-4bbc-91d4-f2d7448eee33" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.060s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 16:10:38 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:10:38 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:10:38 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:10:38 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:10:38 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:10:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:38.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:38 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:10:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:38 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:10:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:39 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:10:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:39 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:10:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:39 np0005532763 nova_compute[231311]: 2025-11-23 21:10:39.433 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:39 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:10:39 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:10:39 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:10:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:10:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:10:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:40.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:10:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:10:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:40.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:10:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:41 np0005532763 nova_compute[231311]: 2025-11-23 21:10:41.432 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:42.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:42.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:44 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:10:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:44 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:10:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:44 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:10:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:44 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:10:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:44.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:44 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:10:44 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:10:44 np0005532763 nova_compute[231311]: 2025-11-23 21:10:44.435 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:10:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:44.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:46.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:46 np0005532763 podman[238110]: 2025-11-23 21:10:46.23095001 +0000 UTC m=+0.102022332 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:10:46 np0005532763 nova_compute[231311]: 2025-11-23 21:10:46.459 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:46.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:48.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:48.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:48 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:10:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:48 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:10:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:48 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:10:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:49 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:10:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:49 np0005532763 nova_compute[231311]: 2025-11-23 21:10:49.405 231315 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763932234.4033012, 28677820-c1a2-4bbc-91d4-f2d7448eee33 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 23 16:10:49 np0005532763 nova_compute[231311]: 2025-11-23 21:10:49.406 231315 INFO nova.compute.manager [-] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] VM Stopped (Lifecycle Event)
Nov 23 16:10:49 np0005532763 nova_compute[231311]: 2025-11-23 21:10:49.429 231315 DEBUG nova.compute.manager [None req-70951ab7-6f08-408b-a6fb-847b39ec171c - - - - - -] [instance: 28677820-c1a2-4bbc-91d4-f2d7448eee33] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 23 16:10:49 np0005532763 nova_compute[231311]: 2025-11-23 21:10:49.437 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:10:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:50.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:50 np0005532763 nova_compute[231311]: 2025-11-23 21:10:50.773 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:50 np0005532763 nova_compute[231311]: 2025-11-23 21:10:50.888 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:50.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:51 np0005532763 nova_compute[231311]: 2025-11-23 21:10:51.502 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:10:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:52.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:52.225 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:10:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:52.226 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:10:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:10:52.226 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:10:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:52.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:53 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:10:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:53 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:10:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:54 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:10:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:54 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:10:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:54.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:54 np0005532763 podman[238139]: 2025-11-23 21:10:54.280165117 +0000 UTC m=+0.152088577 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 23 16:10:54 np0005532763 nova_compute[231311]: 2025-11-23 21:10:54.438 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:10:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:55.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:10:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:56.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:10:56 np0005532763 nova_compute[231311]: 2025-11-23 21:10:56.540 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:57.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:57 np0005532763 podman[238168]: 2025-11-23 21:10:57.210437063 +0000 UTC m=+0.084293319 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, tcib_managed=true)
Nov 23 16:10:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:10:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:10:58.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:10:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:58 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:10:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:58 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:10:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:58 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:10:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:10:59 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:10:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:10:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:10:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:10:59.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:10:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:10:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:10:59 np0005532763 nova_compute[231311]: 2025-11-23 21:10:59.440 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:10:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:10:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:10:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:11:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:00.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:11:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:01.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:01 np0005532763 nova_compute[231311]: 2025-11-23 21:11:01.562 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:02.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:03.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:03 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:11:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:03 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:11:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:03 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:11:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:04 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:11:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:04.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:04 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 16:11:04 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 5472 writes, 28K keys, 5472 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.04 MB/s#012Cumulative WAL: 5472 writes, 5472 syncs, 1.00 writes per sync, written: 0.07 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1525 writes, 6964 keys, 1525 commit groups, 1.0 writes per commit group, ingest: 16.65 MB, 0.03 MB/s#012Interval WAL: 1525 writes, 1525 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    147.5      0.28              0.16        14    0.020       0      0       0.0       0.0#012  L6      1/0   12.44 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.2    178.8    153.3      1.10              0.58        13    0.085     67K   6876       0.0       0.0#012 Sum      1/0   12.44 MB   0.0      0.2     0.0      0.2       0.2      0.1       0.0   5.2    143.1    152.2      1.38              0.74        27    0.051     67K   6876       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   8.2    115.6    115.0      0.52              0.22         8    0.065     23K   2050       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    178.8    153.3      1.10              0.58        13    0.085     67K   6876       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    148.8      0.27              0.16        13    0.021       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1800.0 total, 600.0 interval#012Flush(GB): cumulative 0.040, interval 0.007#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.20 GB write, 0.12 MB/s write, 0.19 GB read, 0.11 MB/s read, 1.4 seconds#012Interval compaction: 0.06 GB write, 0.10 MB/s write, 0.06 GB read, 0.10 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e7d0d09350#2 capacity: 304.00 MB usage: 14.46 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000205 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(781,13.91 MB,4.57566%) FilterBlock(27,201.42 KB,0.0647043%) IndexBlock(27,359.39 KB,0.11545%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Nov 23 16:11:04 np0005532763 nova_compute[231311]: 2025-11-23 21:11:04.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:11:04 np0005532763 nova_compute[231311]: 2025-11-23 21:11:04.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:11:04 np0005532763 nova_compute[231311]: 2025-11-23 21:11:04.443 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:11:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:05.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:05 np0005532763 nova_compute[231311]: 2025-11-23 21:11:05.378 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:11:05 np0005532763 nova_compute[231311]: 2025-11-23 21:11:05.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:11:05 np0005532763 nova_compute[231311]: 2025-11-23 21:11:05.382 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:11:05 np0005532763 nova_compute[231311]: 2025-11-23 21:11:05.382 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:11:05 np0005532763 nova_compute[231311]: 2025-11-23 21:11:05.395 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:11:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:06.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:06 np0005532763 nova_compute[231311]: 2025-11-23 21:11:06.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:11:06 np0005532763 nova_compute[231311]: 2025-11-23 21:11:06.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:11:06 np0005532763 nova_compute[231311]: 2025-11-23 21:11:06.602 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:07.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:07 np0005532763 nova_compute[231311]: 2025-11-23 21:11:07.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:11:07 np0005532763 nova_compute[231311]: 2025-11-23 21:11:07.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:11:07 np0005532763 nova_compute[231311]: 2025-11-23 21:11:07.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:11:07 np0005532763 nova_compute[231311]: 2025-11-23 21:11:07.385 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:11:07 np0005532763 nova_compute[231311]: 2025-11-23 21:11:07.412 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:11:07 np0005532763 nova_compute[231311]: 2025-11-23 21:11:07.413 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:11:07 np0005532763 nova_compute[231311]: 2025-11-23 21:11:07.413 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:11:07 np0005532763 nova_compute[231311]: 2025-11-23 21:11:07.413 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:11:07 np0005532763 nova_compute[231311]: 2025-11-23 21:11:07.414 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:11:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:11:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1354367898' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:11:07 np0005532763 nova_compute[231311]: 2025-11-23 21:11:07.913 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 16:11:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:11:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1250564092' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:11:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:11:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1250564092' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:11:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:08 np0005532763 nova_compute[231311]: 2025-11-23 21:11:08.184 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 23 16:11:08 np0005532763 nova_compute[231311]: 2025-11-23 21:11:08.186 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4903MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 16:11:08 np0005532763 nova_compute[231311]: 2025-11-23 21:11:08.186 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 16:11:08 np0005532763 nova_compute[231311]: 2025-11-23 21:11:08.187 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 16:11:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:11:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:08.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:11:08 np0005532763 nova_compute[231311]: 2025-11-23 21:11:08.256 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 16:11:08 np0005532763 nova_compute[231311]: 2025-11-23 21:11:08.257 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 16:11:08 np0005532763 nova_compute[231311]: 2025-11-23 21:11:08.275 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 16:11:08 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:11:08 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4229839563' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:11:08 np0005532763 nova_compute[231311]: 2025-11-23 21:11:08.759 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 16:11:08 np0005532763 nova_compute[231311]: 2025-11-23 21:11:08.767 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 16:11:08 np0005532763 nova_compute[231311]: 2025-11-23 21:11:08.780 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 16:11:08 np0005532763 nova_compute[231311]: 2025-11-23 21:11:08.808 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 16:11:08 np0005532763 nova_compute[231311]: 2025-11-23 21:11:08.808 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 16:11:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:11:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:11:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:11:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:09 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:11:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:11:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:09.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:11:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:09 np0005532763 nova_compute[231311]: 2025-11-23 21:11:09.445 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:11:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:11:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:10.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:11.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:11 np0005532763 nova_compute[231311]: 2025-11-23 21:11:11.641 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:11:11.758049) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932271758911, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2384, "num_deletes": 251, "total_data_size": 6220592, "memory_usage": 6294776, "flush_reason": "Manual Compaction"}
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932271779707, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 4048390, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 26189, "largest_seqno": 28568, "table_properties": {"data_size": 4038870, "index_size": 5950, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19848, "raw_average_key_size": 20, "raw_value_size": 4019905, "raw_average_value_size": 4118, "num_data_blocks": 261, "num_entries": 976, "num_filter_entries": 976, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763932059, "oldest_key_time": 1763932059, "file_creation_time": 1763932271, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 21722 microseconds, and 14174 cpu microseconds.
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:11:11.779781) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 4048390 bytes OK
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:11:11.779807) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:11:11.781596) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:11:11.781618) EVENT_LOG_v1 {"time_micros": 1763932271781611, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:11:11.781644) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 6210193, prev total WAL file size 6210193, number of live WAL files 2.
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:11:11.784099) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(3953KB)], [51(12MB)]
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932271784188, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 17090922, "oldest_snapshot_seqno": -1}
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5846 keys, 14928350 bytes, temperature: kUnknown
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932271857899, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 14928350, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14888275, "index_size": 24349, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14661, "raw_key_size": 148710, "raw_average_key_size": 25, "raw_value_size": 14781867, "raw_average_value_size": 2528, "num_data_blocks": 994, "num_entries": 5846, "num_filter_entries": 5846, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763932271, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:11:11.858465) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 14928350 bytes
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:11:11.859952) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 231.4 rd, 202.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 12.4 +0.0 blob) out(14.2 +0.0 blob), read-write-amplify(7.9) write-amplify(3.7) OK, records in: 6364, records dropped: 518 output_compression: NoCompression
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:11:11.859988) EVENT_LOG_v1 {"time_micros": 1763932271859971, "job": 30, "event": "compaction_finished", "compaction_time_micros": 73846, "compaction_time_cpu_micros": 53045, "output_level": 6, "num_output_files": 1, "total_output_size": 14928350, "num_input_records": 6364, "num_output_records": 5846, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932271861444, "job": 30, "event": "table_file_deletion", "file_number": 53}
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932271866207, "job": 30, "event": "table_file_deletion", "file_number": 51}
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:11:11.783896) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:11:11.866344) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:11:11.866353) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:11:11.866356) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:11:11.866359) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:11:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:11:11.866362) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:11:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:12.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:13.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:13 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:11:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:14 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:11:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:14 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:11:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:14 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:11:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:14.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:14 np0005532763 nova_compute[231311]: 2025-11-23 21:11:14.448 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:11:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:11:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:15.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:16.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:16 np0005532763 nova_compute[231311]: 2025-11-23 21:11:16.645 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:11:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:17.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:17 np0005532763 podman[238279]: 2025-11-23 21:11:17.215366925 +0000 UTC m=+0.081816969 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 16:11:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:18.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:18 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:11:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:19 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:11:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:19 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:11:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:19 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:11:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:11:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:19.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:11:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:19 np0005532763 nova_compute[231311]: 2025-11-23 21:11:19.450 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:11:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:20.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:21.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:21 np0005532763 nova_compute[231311]: 2025-11-23 21:11:21.646 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:22.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:23.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:23 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:11:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:23 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:11:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:23 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:11:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:24 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:11:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:24.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:24 np0005532763 nova_compute[231311]: 2025-11-23 21:11:24.451 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:11:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:25.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:25 np0005532763 podman[238331]: 2025-11-23 21:11:25.289819746 +0000 UTC m=+0.174811659 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 16:11:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:11:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:26.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:11:26 np0005532763 nova_compute[231311]: 2025-11-23 21:11:26.647 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:26 np0005532763 ovn_controller[133425]: 2025-11-23T21:11:26Z|00047|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 23 16:11:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:11:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:27.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:11:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:28 np0005532763 podman[238363]: 2025-11-23 21:11:28.222027595 +0000 UTC m=+0.091561920 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 23 16:11:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:28.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:28 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:11:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:28 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:11:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:28 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:11:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:29 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:11:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:11:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:29.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:11:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:29 np0005532763 nova_compute[231311]: 2025-11-23 21:11:29.453 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:11:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:30.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:31.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:31 np0005532763 nova_compute[231311]: 2025-11-23 21:11:31.649 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:11:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:32.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:11:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:33.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:33 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:11:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:33 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:11:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:33 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:11:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:34 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:11:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:34.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:34 np0005532763 nova_compute[231311]: 2025-11-23 21:11:34.456 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:11:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:35.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:36.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:36 np0005532763 nova_compute[231311]: 2025-11-23 21:11:36.653 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:36 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:36.734 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3a:26:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:d5:4d:db:d5:2b'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:11:36 np0005532763 nova_compute[231311]: 2025-11-23 21:11:36.735 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:36 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:36.736 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 23 16:11:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:37.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:38.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:38 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:11:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:38 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:11:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:38 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:11:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:39 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:11:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:39.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:39 np0005532763 nova_compute[231311]: 2025-11-23 21:11:39.458 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 16:11:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 8518 writes, 34K keys, 8518 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 8518 writes, 2145 syncs, 3.97 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2417 writes, 8796 keys, 2417 commit groups, 1.0 writes per commit group, ingest: 9.65 MB, 0.02 MB/s#012Interval WAL: 2417 writes, 987 syncs, 2.45 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 16:11:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:11:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:40.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:41.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:41 np0005532763 nova_compute[231311]: 2025-11-23 21:11:41.686 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:42.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:43.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:43 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:43.739 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10e3bf57-dd2d-4b94-851f-925bcd297dde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:11:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:43 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:11:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:43 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:11:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:43 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:11:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:44 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:11:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:44.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:44 np0005532763 nova_compute[231311]: 2025-11-23 21:11:44.460 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:11:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:45.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:45 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:11:45 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:11:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:46.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:46 np0005532763 podman[238674]: 2025-11-23 21:11:46.440098014 +0000 UTC m=+0.074313871 container create 36e3b914ac68bd2049f378b3da7cd336fc30887dd7610f7d553aaa273dae8b33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_poincare, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 23 16:11:46 np0005532763 systemd[1]: Started libpod-conmon-36e3b914ac68bd2049f378b3da7cd336fc30887dd7610f7d553aaa273dae8b33.scope.
Nov 23 16:11:46 np0005532763 podman[238674]: 2025-11-23 21:11:46.409524852 +0000 UTC m=+0.043740759 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 16:11:46 np0005532763 systemd[1]: Started libcrun container.
Nov 23 16:11:46 np0005532763 podman[238674]: 2025-11-23 21:11:46.562353668 +0000 UTC m=+0.196569535 container init 36e3b914ac68bd2049f378b3da7cd336fc30887dd7610f7d553aaa273dae8b33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_poincare, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 23 16:11:46 np0005532763 podman[238674]: 2025-11-23 21:11:46.573969282 +0000 UTC m=+0.208185149 container start 36e3b914ac68bd2049f378b3da7cd336fc30887dd7610f7d553aaa273dae8b33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_poincare, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 16:11:46 np0005532763 podman[238674]: 2025-11-23 21:11:46.579107365 +0000 UTC m=+0.213323222 container attach 36e3b914ac68bd2049f378b3da7cd336fc30887dd7610f7d553aaa273dae8b33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_poincare, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 23 16:11:46 np0005532763 priceless_poincare[238691]: 167 167
Nov 23 16:11:46 np0005532763 systemd[1]: libpod-36e3b914ac68bd2049f378b3da7cd336fc30887dd7610f7d553aaa273dae8b33.scope: Deactivated successfully.
Nov 23 16:11:46 np0005532763 conmon[238691]: conmon 36e3b914ac68bd2049f3 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-36e3b914ac68bd2049f378b3da7cd336fc30887dd7610f7d553aaa273dae8b33.scope/container/memory.events
Nov 23 16:11:46 np0005532763 podman[238674]: 2025-11-23 21:11:46.586050358 +0000 UTC m=+0.220266205 container died 36e3b914ac68bd2049f378b3da7cd336fc30887dd7610f7d553aaa273dae8b33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_poincare, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Nov 23 16:11:46 np0005532763 systemd[1]: var-lib-containers-storage-overlay-bf771527b83a19db55f1cf2c07a422050cb8768fa08b04d6fef5d10333220a2c-merged.mount: Deactivated successfully.
Nov 23 16:11:46 np0005532763 podman[238674]: 2025-11-23 21:11:46.643067376 +0000 UTC m=+0.277283233 container remove 36e3b914ac68bd2049f378b3da7cd336fc30887dd7610f7d553aaa273dae8b33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=priceless_poincare, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Nov 23 16:11:46 np0005532763 systemd[1]: libpod-conmon-36e3b914ac68bd2049f378b3da7cd336fc30887dd7610f7d553aaa273dae8b33.scope: Deactivated successfully.
Nov 23 16:11:46 np0005532763 nova_compute[231311]: 2025-11-23 21:11:46.714 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:46 np0005532763 podman[238715]: 2025-11-23 21:11:46.87987437 +0000 UTC m=+0.070811823 container create a65f739b33a412e432a45e05e88c443c38000e66e32c2b9d4b5da4d161a89834 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_solomon, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2)
Nov 23 16:11:46 np0005532763 systemd[1]: Started libpod-conmon-a65f739b33a412e432a45e05e88c443c38000e66e32c2b9d4b5da4d161a89834.scope.
Nov 23 16:11:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:46 np0005532763 podman[238715]: 2025-11-23 21:11:46.853747532 +0000 UTC m=+0.044685035 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Nov 23 16:11:46 np0005532763 systemd[1]: Started libcrun container.
Nov 23 16:11:46 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8a331e46b8521ad0980707f693f8d7341eaee64d15fdd09a40b86130b1e0e99/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Nov 23 16:11:46 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8a331e46b8521ad0980707f693f8d7341eaee64d15fdd09a40b86130b1e0e99/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Nov 23 16:11:46 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8a331e46b8521ad0980707f693f8d7341eaee64d15fdd09a40b86130b1e0e99/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Nov 23 16:11:46 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8a331e46b8521ad0980707f693f8d7341eaee64d15fdd09a40b86130b1e0e99/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Nov 23 16:11:47 np0005532763 podman[238715]: 2025-11-23 21:11:46.999894222 +0000 UTC m=+0.190831705 container init a65f739b33a412e432a45e05e88c443c38000e66e32c2b9d4b5da4d161a89834 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_solomon, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 23 16:11:47 np0005532763 podman[238715]: 2025-11-23 21:11:47.011966748 +0000 UTC m=+0.202904201 container start a65f739b33a412e432a45e05e88c443c38000e66e32c2b9d4b5da4d161a89834 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_solomon, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Nov 23 16:11:47 np0005532763 podman[238715]: 2025-11-23 21:11:47.015884827 +0000 UTC m=+0.206822330 container attach a65f739b33a412e432a45e05e88c443c38000e66e32c2b9d4b5da4d161a89834 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_solomon, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Nov 23 16:11:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:47.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:47 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:11:47 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:11:47 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:11:47 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]: [
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:    {
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:        "available": false,
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:        "being_replaced": false,
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:        "ceph_device_lvm": false,
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:        "device_id": "QEMU_DVD-ROM_QM00001",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:        "lsm_data": {},
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:        "lvs": [],
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:        "path": "/dev/sr0",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:        "rejected_reasons": [
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "Insufficient space (<5GB)",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "Has a FileSystem"
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:        ],
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:        "sys_api": {
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "actuators": null,
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "device_nodes": [
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:                "sr0"
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            ],
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "devname": "sr0",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "human_readable_size": "482.00 KB",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "id_bus": "ata",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "model": "QEMU DVD-ROM",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "nr_requests": "2",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "parent": "/dev/sr0",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "partitions": {},
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "path": "/dev/sr0",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "removable": "1",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "rev": "2.5+",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "ro": "0",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "rotational": "1",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "sas_address": "",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "sas_device_handle": "",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "scheduler_mode": "mq-deadline",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "sectors": 0,
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "sectorsize": "2048",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "size": 493568.0,
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "support_discard": "2048",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "type": "disk",
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:            "vendor": "QEMU"
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:        }
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]:    }
Nov 23 16:11:47 np0005532763 sweet_solomon[238732]: ]
Nov 23 16:11:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:47 np0005532763 systemd[1]: libpod-a65f739b33a412e432a45e05e88c443c38000e66e32c2b9d4b5da4d161a89834.scope: Deactivated successfully.
Nov 23 16:11:47 np0005532763 podman[238715]: 2025-11-23 21:11:47.951896642 +0000 UTC m=+1.142834085 container died a65f739b33a412e432a45e05e88c443c38000e66e32c2b9d4b5da4d161a89834 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_solomon, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 23 16:11:47 np0005532763 systemd[1]: var-lib-containers-storage-overlay-d8a331e46b8521ad0980707f693f8d7341eaee64d15fdd09a40b86130b1e0e99-merged.mount: Deactivated successfully.
Nov 23 16:11:48 np0005532763 podman[238715]: 2025-11-23 21:11:48.010787561 +0000 UTC m=+1.201725004 container remove a65f739b33a412e432a45e05e88c443c38000e66e32c2b9d4b5da4d161a89834 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sweet_solomon, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Nov 23 16:11:48 np0005532763 systemd[1]: libpod-conmon-a65f739b33a412e432a45e05e88c443c38000e66e32c2b9d4b5da4d161a89834.scope: Deactivated successfully.
Nov 23 16:11:48 np0005532763 podman[239884]: 2025-11-23 21:11:48.064513467 +0000 UTC m=+0.076885711 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 23 16:11:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:48.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:48 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:11:48 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:11:48 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:11:48 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:11:48 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:11:48 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:11:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:48 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:11:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:49 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:11:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:49 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:11:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:49 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:11:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:49.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:49 np0005532763 nova_compute[231311]: 2025-11-23 21:11:49.462 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:11:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:11:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:50.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:51.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:51 np0005532763 nova_compute[231311]: 2025-11-23 21:11:51.717 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:11:51 np0005532763 nova_compute[231311]: 2025-11-23 21:11:51.739 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "06b47618-b4a9-4de4-94aa-b97241ff094c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 16:11:51 np0005532763 nova_compute[231311]: 2025-11-23 21:11:51.740 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 16:11:51 np0005532763 nova_compute[231311]: 2025-11-23 21:11:51.754 231315 DEBUG nova.compute.manager [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 23 16:11:51 np0005532763 nova_compute[231311]: 2025-11-23 21:11:51.835 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 16:11:51 np0005532763 nova_compute[231311]: 2025-11-23 21:11:51.835 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 16:11:51 np0005532763 nova_compute[231311]: 2025-11-23 21:11:51.844 231315 DEBUG nova.virt.hardware [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 23 16:11:51 np0005532763 nova_compute[231311]: 2025-11-23 21:11:51.845 231315 INFO nova.compute.claims [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Claim successful on node compute-2.ctlplane.example.com
Nov 23 16:11:51 np0005532763 nova_compute[231311]: 2025-11-23 21:11:51.936 231315 DEBUG oslo_concurrency.processutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 16:11:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:52.225 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 16:11:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:52.226 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 16:11:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:52.226 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 16:11:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:52.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:52 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:11:52 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/759690542' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.370 231315 DEBUG oslo_concurrency.processutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.379 231315 DEBUG nova.compute.provider_tree [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.443 231315 DEBUG nova.scheduler.client.report [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.468 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.633s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.469 231315 DEBUG nova.compute.manager [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.519 231315 DEBUG nova.compute.manager [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.520 231315 DEBUG nova.network.neutron [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.539 231315 INFO nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.558 231315 DEBUG nova.compute.manager [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.669 231315 DEBUG nova.compute.manager [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.671 231315 DEBUG nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.672 231315 INFO nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Creating image(s)
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.712 231315 DEBUG nova.storage.rbd_utils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 06b47618-b4a9-4de4-94aa-b97241ff094c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.741 231315 DEBUG nova.storage.rbd_utils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 06b47618-b4a9-4de4-94aa-b97241ff094c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.768 231315 DEBUG nova.storage.rbd_utils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 06b47618-b4a9-4de4-94aa-b97241ff094c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.772 231315 DEBUG oslo_concurrency.processutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.858 231315 DEBUG oslo_concurrency.processutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.859 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.860 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.861 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.895 231315 DEBUG nova.storage.rbd_utils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 06b47618-b4a9-4de4-94aa-b97241ff094c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Nov 23 16:11:52 np0005532763 nova_compute[231311]: 2025-11-23 21:11:52.900 231315 DEBUG oslo_concurrency.processutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 06b47618-b4a9-4de4-94aa-b97241ff094c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 16:11:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:53 np0005532763 nova_compute[231311]: 2025-11-23 21:11:53.094 231315 DEBUG nova.policy [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9fb5352c62684f2ba3a326a953a10dfe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '782593db60784ab8bff41fe87d72ff5f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 23 16:11:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:53.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:53 np0005532763 nova_compute[231311]: 2025-11-23 21:11:53.247 231315 DEBUG oslo_concurrency.processutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 06b47618-b4a9-4de4-94aa-b97241ff094c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.347s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 16:11:53 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:11:53 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:11:53 np0005532763 nova_compute[231311]: 2025-11-23 21:11:53.354 231315 DEBUG nova.storage.rbd_utils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] resizing rbd image 06b47618-b4a9-4de4-94aa-b97241ff094c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Nov 23 16:11:53 np0005532763 nova_compute[231311]: 2025-11-23 21:11:53.492 231315 DEBUG nova.objects.instance [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lazy-loading 'migration_context' on Instance uuid 06b47618-b4a9-4de4-94aa-b97241ff094c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 23 16:11:53 np0005532763 nova_compute[231311]: 2025-11-23 21:11:53.514 231315 DEBUG nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 23 16:11:53 np0005532763 nova_compute[231311]: 2025-11-23 21:11:53.515 231315 DEBUG nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Ensure instance console log exists: /var/lib/nova/instances/06b47618-b4a9-4de4-94aa-b97241ff094c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 23 16:11:53 np0005532763 nova_compute[231311]: 2025-11-23 21:11:53.516 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 16:11:53 np0005532763 nova_compute[231311]: 2025-11-23 21:11:53.517 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 16:11:53 np0005532763 nova_compute[231311]: 2025-11-23 21:11:53.517 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 16:11:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:53 np0005532763 nova_compute[231311]: 2025-11-23 21:11:53.978 231315 DEBUG nova.network.neutron [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Successfully created port: 6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 23 16:11:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:53 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:11:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:54 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:11:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:54 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:11:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:54 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:11:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:54.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:54 np0005532763 nova_compute[231311]: 2025-11-23 21:11:54.463 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:11:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:11:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:55.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:55 np0005532763 nova_compute[231311]: 2025-11-23 21:11:55.296 231315 DEBUG nova.network.neutron [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Successfully updated port: 6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 23 16:11:55 np0005532763 nova_compute[231311]: 2025-11-23 21:11:55.315 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "refresh_cache-06b47618-b4a9-4de4-94aa-b97241ff094c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 16:11:55 np0005532763 nova_compute[231311]: 2025-11-23 21:11:55.316 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquired lock "refresh_cache-06b47618-b4a9-4de4-94aa-b97241ff094c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 16:11:55 np0005532763 nova_compute[231311]: 2025-11-23 21:11:55.316 231315 DEBUG nova.network.neutron [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 23 16:11:55 np0005532763 nova_compute[231311]: 2025-11-23 21:11:55.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 16:11:55 np0005532763 nova_compute[231311]: 2025-11-23 21:11:55.399 231315 DEBUG nova.compute.manager [req-266035ef-75b0-4488-8731-7b86eeff4015 req-533722d1-f3a6-4dd9-a246-52b9778e3c1a 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Received event network-changed-6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 23 16:11:55 np0005532763 nova_compute[231311]: 2025-11-23 21:11:55.400 231315 DEBUG nova.compute.manager [req-266035ef-75b0-4488-8731-7b86eeff4015 req-533722d1-f3a6-4dd9-a246-52b9778e3c1a 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Refreshing instance network info cache due to event network-changed-6840e90e-b1cd-46fd-b6c2-5a573ed8dde5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 23 16:11:55 np0005532763 nova_compute[231311]: 2025-11-23 21:11:55.401 231315 DEBUG oslo_concurrency.lockutils [req-266035ef-75b0-4488-8731-7b86eeff4015 req-533722d1-f3a6-4dd9-a246-52b9778e3c1a 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "refresh_cache-06b47618-b4a9-4de4-94aa-b97241ff094c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 16:11:55 np0005532763 nova_compute[231311]: 2025-11-23 21:11:55.480 231315 DEBUG nova.network.neutron [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 23 16:11:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:56 np0005532763 podman[240135]: 2025-11-23 21:11:56.258444877 +0000 UTC m=+0.130002841 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 23 16:11:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:56.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.524 231315 DEBUG nova.network.neutron [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Updating instance_info_cache with network_info: [{"id": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "address": "fa:16:3e:f8:3f:80", "network": {"id": "a53cafa8-a74e-467c-9117-a31bd6c650ae", "bridge": "br-int", "label": "tempest-network-smoke--511994107", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6840e90e-b1", "ovs_interfaceid": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.540 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Releasing lock "refresh_cache-06b47618-b4a9-4de4-94aa-b97241ff094c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.540 231315 DEBUG nova.compute.manager [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Instance network_info: |[{"id": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "address": "fa:16:3e:f8:3f:80", "network": {"id": "a53cafa8-a74e-467c-9117-a31bd6c650ae", "bridge": "br-int", "label": "tempest-network-smoke--511994107", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6840e90e-b1", "ovs_interfaceid": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.541 231315 DEBUG oslo_concurrency.lockutils [req-266035ef-75b0-4488-8731-7b86eeff4015 req-533722d1-f3a6-4dd9-a246-52b9778e3c1a 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquired lock "refresh_cache-06b47618-b4a9-4de4-94aa-b97241ff094c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.541 231315 DEBUG nova.network.neutron [req-266035ef-75b0-4488-8731-7b86eeff4015 req-533722d1-f3a6-4dd9-a246-52b9778e3c1a 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Refreshing network info cache for port 6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.546 231315 DEBUG nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Start _get_guest_xml network_info=[{"id": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "address": "fa:16:3e:f8:3f:80", "network": {"id": "a53cafa8-a74e-467c-9117-a31bd6c650ae", "bridge": "br-int", "label": "tempest-network-smoke--511994107", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6840e90e-b1", "ovs_interfaceid": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-23T21:05:43Z,direct_url=<?>,disk_format='qcow2',id=3c45fa6c-8a99-4359-a34e-d89f4e1e77d0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3f8fb5175f85402ba20cf9c6989d47cf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-23T21:05:47Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'encryption_options': None, 'size': 0, 'image_id': '3c45fa6c-8a99-4359-a34e-d89f4e1e77d0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.553 231315 WARNING nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.558 231315 DEBUG nova.virt.libvirt.host [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.559 231315 DEBUG nova.virt.libvirt.host [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.567 231315 DEBUG nova.virt.libvirt.host [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.568 231315 DEBUG nova.virt.libvirt.host [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.569 231315 DEBUG nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.569 231315 DEBUG nova.virt.hardware [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-23T21:05:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='56044b93-2979-48aa-b67f-c37e1b489306',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-23T21:05:43Z,direct_url=<?>,disk_format='qcow2',id=3c45fa6c-8a99-4359-a34e-d89f4e1e77d0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3f8fb5175f85402ba20cf9c6989d47cf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-23T21:05:47Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.570 231315 DEBUG nova.virt.hardware [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.571 231315 DEBUG nova.virt.hardware [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.571 231315 DEBUG nova.virt.hardware [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.571 231315 DEBUG nova.virt.hardware [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.572 231315 DEBUG nova.virt.hardware [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.572 231315 DEBUG nova.virt.hardware [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.573 231315 DEBUG nova.virt.hardware [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.573 231315 DEBUG nova.virt.hardware [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.574 231315 DEBUG nova.virt.hardware [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.574 231315 DEBUG nova.virt.hardware [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.579 231315 DEBUG oslo_concurrency.processutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:11:56 np0005532763 nova_compute[231311]: 2025-11-23 21:11:56.772 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:57 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 16:11:57 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1825239050' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.071 231315 DEBUG oslo_concurrency.processutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:11:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:57.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.110 231315 DEBUG nova.storage.rbd_utils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 06b47618-b4a9-4de4-94aa-b97241ff094c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.115 231315 DEBUG oslo_concurrency.processutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:11:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:57 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 16:11:57 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1956243388' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.590 231315 DEBUG oslo_concurrency.processutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.593 231315 DEBUG nova.virt.libvirt.vif [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T21:11:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1151402468',display_name='tempest-TestNetworkBasicOps-server-1151402468',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1151402468',id=7,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOd9V/Qc+jZYb7RloX+DMFZ2y2px5S2r592+OMqtBAZfMl2Em9uz+jMlWKxAJG012CbASA7fOCn/CjEGt52Mes1DFgmLKPdgn3aP8AXdHCiAl9UyywwteZEGBeiwfxRB2Q==',key_name='tempest-TestNetworkBasicOps-166388418',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-axnp2ql9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-23T21:11:52Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=06b47618-b4a9-4de4-94aa-b97241ff094c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "address": "fa:16:3e:f8:3f:80", "network": {"id": "a53cafa8-a74e-467c-9117-a31bd6c650ae", "bridge": "br-int", "label": "tempest-network-smoke--511994107", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6840e90e-b1", "ovs_interfaceid": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.593 231315 DEBUG nova.network.os_vif_util [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "address": "fa:16:3e:f8:3f:80", "network": {"id": "a53cafa8-a74e-467c-9117-a31bd6c650ae", "bridge": "br-int", "label": "tempest-network-smoke--511994107", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6840e90e-b1", "ovs_interfaceid": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.595 231315 DEBUG nova.network.os_vif_util [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:3f:80,bridge_name='br-int',has_traffic_filtering=True,id=6840e90e-b1cd-46fd-b6c2-5a573ed8dde5,network=Network(a53cafa8-a74e-467c-9117-a31bd6c650ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6840e90e-b1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.597 231315 DEBUG nova.objects.instance [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lazy-loading 'pci_devices' on Instance uuid 06b47618-b4a9-4de4-94aa-b97241ff094c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.614 231315 DEBUG nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] End _get_guest_xml xml=<domain type="kvm">
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  <uuid>06b47618-b4a9-4de4-94aa-b97241ff094c</uuid>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  <name>instance-00000007</name>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  <memory>131072</memory>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  <vcpu>1</vcpu>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  <metadata>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <nova:name>tempest-TestNetworkBasicOps-server-1151402468</nova:name>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <nova:creationTime>2025-11-23 21:11:56</nova:creationTime>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <nova:flavor name="m1.nano">
Nov 23 16:11:57 np0005532763 nova_compute[231311]:        <nova:memory>128</nova:memory>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:        <nova:disk>1</nova:disk>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:        <nova:swap>0</nova:swap>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:        <nova:ephemeral>0</nova:ephemeral>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:        <nova:vcpus>1</nova:vcpus>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      </nova:flavor>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <nova:owner>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:        <nova:user uuid="9fb5352c62684f2ba3a326a953a10dfe">tempest-TestNetworkBasicOps-1975357669-project-member</nova:user>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:        <nova:project uuid="782593db60784ab8bff41fe87d72ff5f">tempest-TestNetworkBasicOps-1975357669</nova:project>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      </nova:owner>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <nova:root type="image" uuid="3c45fa6c-8a99-4359-a34e-d89f4e1e77d0"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <nova:ports>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:        <nova:port uuid="6840e90e-b1cd-46fd-b6c2-5a573ed8dde5">
Nov 23 16:11:57 np0005532763 nova_compute[231311]:          <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:        </nova:port>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      </nova:ports>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    </nova:instance>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  </metadata>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  <sysinfo type="smbios">
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <system>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <entry name="manufacturer">RDO</entry>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <entry name="product">OpenStack Compute</entry>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <entry name="serial">06b47618-b4a9-4de4-94aa-b97241ff094c</entry>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <entry name="uuid">06b47618-b4a9-4de4-94aa-b97241ff094c</entry>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <entry name="family">Virtual Machine</entry>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    </system>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  </sysinfo>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  <os>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <boot dev="hd"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <smbios mode="sysinfo"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  </os>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  <features>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <acpi/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <apic/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <vmcoreinfo/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  </features>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  <clock offset="utc">
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <timer name="pit" tickpolicy="delay"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <timer name="hpet" present="no"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  </clock>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  <cpu mode="host-model" match="exact">
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <topology sockets="1" cores="1" threads="1"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  </cpu>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  <devices>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <disk type="network" device="disk">
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <driver type="raw" cache="none"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <source protocol="rbd" name="vms/06b47618-b4a9-4de4-94aa-b97241ff094c_disk">
Nov 23 16:11:57 np0005532763 nova_compute[231311]:        <host name="192.168.122.100" port="6789"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:        <host name="192.168.122.102" port="6789"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:        <host name="192.168.122.101" port="6789"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      </source>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <auth username="openstack">
Nov 23 16:11:57 np0005532763 nova_compute[231311]:        <secret type="ceph" uuid="03808be8-ae4a-5548-82e6-4a294f1bc627"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      </auth>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <target dev="vda" bus="virtio"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    </disk>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <disk type="network" device="cdrom">
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <driver type="raw" cache="none"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <source protocol="rbd" name="vms/06b47618-b4a9-4de4-94aa-b97241ff094c_disk.config">
Nov 23 16:11:57 np0005532763 nova_compute[231311]:        <host name="192.168.122.100" port="6789"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:        <host name="192.168.122.102" port="6789"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:        <host name="192.168.122.101" port="6789"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      </source>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <auth username="openstack">
Nov 23 16:11:57 np0005532763 nova_compute[231311]:        <secret type="ceph" uuid="03808be8-ae4a-5548-82e6-4a294f1bc627"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      </auth>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <target dev="sda" bus="sata"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    </disk>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <interface type="ethernet">
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <mac address="fa:16:3e:f8:3f:80"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <model type="virtio"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <driver name="vhost" rx_queue_size="512"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <mtu size="1442"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <target dev="tap6840e90e-b1"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    </interface>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <serial type="pty">
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <log file="/var/lib/nova/instances/06b47618-b4a9-4de4-94aa-b97241ff094c/console.log" append="off"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    </serial>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <video>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <model type="virtio"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    </video>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <input type="tablet" bus="usb"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <rng model="virtio">
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <backend model="random">/dev/urandom</backend>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    </rng>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <controller type="usb" index="0"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    <memballoon model="virtio">
Nov 23 16:11:57 np0005532763 nova_compute[231311]:      <stats period="10"/>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:    </memballoon>
Nov 23 16:11:57 np0005532763 nova_compute[231311]:  </devices>
Nov 23 16:11:57 np0005532763 nova_compute[231311]: </domain>
Nov 23 16:11:57 np0005532763 nova_compute[231311]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.615 231315 DEBUG nova.compute.manager [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Preparing to wait for external event network-vif-plugged-6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.616 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.617 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.617 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.618 231315 DEBUG nova.virt.libvirt.vif [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T21:11:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1151402468',display_name='tempest-TestNetworkBasicOps-server-1151402468',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1151402468',id=7,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOd9V/Qc+jZYb7RloX+DMFZ2y2px5S2r592+OMqtBAZfMl2Em9uz+jMlWKxAJG012CbASA7fOCn/CjEGt52Mes1DFgmLKPdgn3aP8AXdHCiAl9UyywwteZEGBeiwfxRB2Q==',key_name='tempest-TestNetworkBasicOps-166388418',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-axnp2ql9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-23T21:11:52Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=06b47618-b4a9-4de4-94aa-b97241ff094c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "address": "fa:16:3e:f8:3f:80", "network": {"id": "a53cafa8-a74e-467c-9117-a31bd6c650ae", "bridge": "br-int", "label": "tempest-network-smoke--511994107", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6840e90e-b1", "ovs_interfaceid": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.619 231315 DEBUG nova.network.os_vif_util [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "address": "fa:16:3e:f8:3f:80", "network": {"id": "a53cafa8-a74e-467c-9117-a31bd6c650ae", "bridge": "br-int", "label": "tempest-network-smoke--511994107", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6840e90e-b1", "ovs_interfaceid": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.620 231315 DEBUG nova.network.os_vif_util [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:3f:80,bridge_name='br-int',has_traffic_filtering=True,id=6840e90e-b1cd-46fd-b6c2-5a573ed8dde5,network=Network(a53cafa8-a74e-467c-9117-a31bd6c650ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6840e90e-b1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.620 231315 DEBUG os_vif [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:3f:80,bridge_name='br-int',has_traffic_filtering=True,id=6840e90e-b1cd-46fd-b6c2-5a573ed8dde5,network=Network(a53cafa8-a74e-467c-9117-a31bd6c650ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6840e90e-b1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.622 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.622 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.623 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.627 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.628 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6840e90e-b1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.628 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6840e90e-b1, col_values=(('external_ids', {'iface-id': '6840e90e-b1cd-46fd-b6c2-5a573ed8dde5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f8:3f:80', 'vm-uuid': '06b47618-b4a9-4de4-94aa-b97241ff094c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.630 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.633 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 16:11:57 np0005532763 NetworkManager[48849]: <info>  [1763932317.6331] manager: (tap6840e90e-b1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.639 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.640 231315 INFO os_vif [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:3f:80,bridge_name='br-int',has_traffic_filtering=True,id=6840e90e-b1cd-46fd-b6c2-5a573ed8dde5,network=Network(a53cafa8-a74e-467c-9117-a31bd6c650ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6840e90e-b1')#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.702 231315 DEBUG nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.703 231315 DEBUG nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.703 231315 DEBUG nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No VIF found with MAC fa:16:3e:f8:3f:80, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.704 231315 INFO nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Using config drive#033[00m
Nov 23 16:11:57 np0005532763 nova_compute[231311]: 2025-11-23 21:11:57.747 231315 DEBUG nova.storage.rbd_utils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 06b47618-b4a9-4de4-94aa-b97241ff094c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:11:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:11:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:11:58.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:11:58 np0005532763 nova_compute[231311]: 2025-11-23 21:11:58.356 231315 INFO nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Creating config drive at /var/lib/nova/instances/06b47618-b4a9-4de4-94aa-b97241ff094c/disk.config#033[00m
Nov 23 16:11:58 np0005532763 nova_compute[231311]: 2025-11-23 21:11:58.365 231315 DEBUG oslo_concurrency.processutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/06b47618-b4a9-4de4-94aa-b97241ff094c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0uxnv761 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:11:58 np0005532763 nova_compute[231311]: 2025-11-23 21:11:58.395 231315 DEBUG nova.network.neutron [req-266035ef-75b0-4488-8731-7b86eeff4015 req-533722d1-f3a6-4dd9-a246-52b9778e3c1a 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Updated VIF entry in instance network info cache for port 6840e90e-b1cd-46fd-b6c2-5a573ed8dde5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 23 16:11:58 np0005532763 nova_compute[231311]: 2025-11-23 21:11:58.396 231315 DEBUG nova.network.neutron [req-266035ef-75b0-4488-8731-7b86eeff4015 req-533722d1-f3a6-4dd9-a246-52b9778e3c1a 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Updating instance_info_cache with network_info: [{"id": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "address": "fa:16:3e:f8:3f:80", "network": {"id": "a53cafa8-a74e-467c-9117-a31bd6c650ae", "bridge": "br-int", "label": "tempest-network-smoke--511994107", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6840e90e-b1", "ovs_interfaceid": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:11:58 np0005532763 nova_compute[231311]: 2025-11-23 21:11:58.412 231315 DEBUG oslo_concurrency.lockutils [req-266035ef-75b0-4488-8731-7b86eeff4015 req-533722d1-f3a6-4dd9-a246-52b9778e3c1a 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Releasing lock "refresh_cache-06b47618-b4a9-4de4-94aa-b97241ff094c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:11:58 np0005532763 nova_compute[231311]: 2025-11-23 21:11:58.506 231315 DEBUG oslo_concurrency.processutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/06b47618-b4a9-4de4-94aa-b97241ff094c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0uxnv761" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:11:58 np0005532763 nova_compute[231311]: 2025-11-23 21:11:58.548 231315 DEBUG nova.storage.rbd_utils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 06b47618-b4a9-4de4-94aa-b97241ff094c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:11:58 np0005532763 nova_compute[231311]: 2025-11-23 21:11:58.552 231315 DEBUG oslo_concurrency.processutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/06b47618-b4a9-4de4-94aa-b97241ff094c/disk.config 06b47618-b4a9-4de4-94aa-b97241ff094c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:11:58 np0005532763 nova_compute[231311]: 2025-11-23 21:11:58.746 231315 DEBUG oslo_concurrency.processutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/06b47618-b4a9-4de4-94aa-b97241ff094c/disk.config 06b47618-b4a9-4de4-94aa-b97241ff094c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.194s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:11:58 np0005532763 nova_compute[231311]: 2025-11-23 21:11:58.748 231315 INFO nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Deleting local config drive /var/lib/nova/instances/06b47618-b4a9-4de4-94aa-b97241ff094c/disk.config because it was imported into RBD.#033[00m
Nov 23 16:11:58 np0005532763 kernel: tap6840e90e-b1: entered promiscuous mode
Nov 23 16:11:58 np0005532763 NetworkManager[48849]: <info>  [1763932318.8271] manager: (tap6840e90e-b1): new Tun device (/org/freedesktop/NetworkManager/Devices/39)
Nov 23 16:11:58 np0005532763 nova_compute[231311]: 2025-11-23 21:11:58.827 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:58 np0005532763 ovn_controller[133425]: 2025-11-23T21:11:58Z|00048|binding|INFO|Claiming lport 6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 for this chassis.
Nov 23 16:11:58 np0005532763 ovn_controller[133425]: 2025-11-23T21:11:58Z|00049|binding|INFO|6840e90e-b1cd-46fd-b6c2-5a573ed8dde5: Claiming fa:16:3e:f8:3f:80 10.100.0.25
Nov 23 16:11:58 np0005532763 nova_compute[231311]: 2025-11-23 21:11:58.837 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:58.853 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:3f:80 10.100.0.25'], port_security=['fa:16:3e:f8:3f:80 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '06b47618-b4a9-4de4-94aa-b97241ff094c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a53cafa8-a74e-467c-9117-a31bd6c650ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782593db60784ab8bff41fe87d72ff5f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '10befa5b-28b9-4956-82b3-968d8cb6ea4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c22c132b-3565-4344-9558-f1d93c19cb57, chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], logical_port=6840e90e-b1cd-46fd-b6c2-5a573ed8dde5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:11:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:58.856 142920 INFO neutron.agent.ovn.metadata.agent [-] Port 6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 in datapath a53cafa8-a74e-467c-9117-a31bd6c650ae bound to our chassis#033[00m
Nov 23 16:11:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:58.858 142920 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a53cafa8-a74e-467c-9117-a31bd6c650ae#033[00m
Nov 23 16:11:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:58.877 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[e5885bf0-ad22-4254-aade-286e36da9a4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:11:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:58.879 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa53cafa8-a1 in ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 23 16:11:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:58.883 235389 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa53cafa8-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 23 16:11:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:58.883 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[db8a2df9-e339-4cda-8e3e-103d86746816]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:11:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:58.885 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[6fb9a65d-0c58-4a57-ae78-6f2a8604cf49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:11:58 np0005532763 systemd-machined[194484]: New machine qemu-4-instance-00000007.
Nov 23 16:11:58 np0005532763 systemd-udevd[240308]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 16:11:58 np0005532763 systemd[1]: Started Virtual Machine qemu-4-instance-00000007.
Nov 23 16:11:58 np0005532763 ovn_controller[133425]: 2025-11-23T21:11:58Z|00050|binding|INFO|Setting lport 6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 ovn-installed in OVS
Nov 23 16:11:58 np0005532763 ovn_controller[133425]: 2025-11-23T21:11:58Z|00051|binding|INFO|Setting lport 6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 up in Southbound
Nov 23 16:11:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:58 np0005532763 nova_compute[231311]: 2025-11-23 21:11:58.962 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:58.961 143034 DEBUG oslo.privsep.daemon [-] privsep: reply[d5c88767-0792-4fdb-845e-8a2837760fcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:11:58 np0005532763 NetworkManager[48849]: <info>  [1763932318.9631] device (tap6840e90e-b1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 23 16:11:58 np0005532763 NetworkManager[48849]: <info>  [1763932318.9644] device (tap6840e90e-b1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 23 16:11:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:58.984 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[b234d9f2-122c-44fd-a830-e41fbf9c74f2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:11:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:58 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:11:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:58 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:11:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:58 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:11:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:11:59 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:11:59 np0005532763 podman[240297]: 2025-11-23 21:11:59.012138665 +0000 UTC m=+0.144381011 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.016 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[f6940bf3-ebad-438f-8a37-9ac335c4cdcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.024 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[f7f533eb-4fdd-473a-89ae-cf63811a8164]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:11:59 np0005532763 NetworkManager[48849]: <info>  [1763932319.0256] manager: (tapa53cafa8-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/40)
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.053 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[b25b4078-a060-440d-a95a-e02112e436e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.056 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[7c3e6b17-3e34-4f8a-9501-f4591ab6b1b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:11:59 np0005532763 NetworkManager[48849]: <info>  [1763932319.0784] device (tapa53cafa8-a0): carrier: link connected
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.083 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[2188f807-39ea-45dc-95b5-1e30d7430258]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.100 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[ef8d39f1-c3db-463c-be05-4d32fef65e08]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa53cafa8-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:b5:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426861, 'reachable_time': 36652, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240355, 'error': None, 'target': 'ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:11:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:11:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:11:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:11:59.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.117 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[2a0104da-d5e0-4b51-8ce7-53db80c35cfb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:b52b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 426861, 'tstamp': 426861}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240356, 'error': None, 'target': 'ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.133 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[6ae13acf-afab-4bac-8b94-731761ea9e16]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa53cafa8-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:b5:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426861, 'reachable_time': 36652, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240357, 'error': None, 'target': 'ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.172 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[6ec36bbe-2502-4c11-83f1-91fa834287fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:11:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:11:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.260 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[b5704e27-3464-4300-801a-5d3d4fcd320b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.261 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa53cafa8-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.261 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.262 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa53cafa8-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.265 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:59 np0005532763 NetworkManager[48849]: <info>  [1763932319.2661] manager: (tapa53cafa8-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Nov 23 16:11:59 np0005532763 kernel: tapa53cafa8-a0: entered promiscuous mode
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.269 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.270 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa53cafa8-a0, col_values=(('external_ids', {'iface-id': 'cab0b4e0-79b2-41b3-92b4-7053f2aab9f8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.271 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:59 np0005532763 ovn_controller[133425]: 2025-11-23T21:11:59Z|00052|binding|INFO|Releasing lport cab0b4e0-79b2-41b3-92b4-7053f2aab9f8 from this chassis (sb_readonly=0)
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.300 231315 DEBUG nova.compute.manager [req-428d0abb-d689-410a-92af-eb2a31d53da6 req-bee1779f-8769-4381-b4bd-3f022b1c08ca 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Received event network-vif-plugged-6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.300 231315 DEBUG oslo_concurrency.lockutils [req-428d0abb-d689-410a-92af-eb2a31d53da6 req-bee1779f-8769-4381-b4bd-3f022b1c08ca 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.300 231315 DEBUG oslo_concurrency.lockutils [req-428d0abb-d689-410a-92af-eb2a31d53da6 req-bee1779f-8769-4381-b4bd-3f022b1c08ca 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.300 231315 DEBUG oslo_concurrency.lockutils [req-428d0abb-d689-410a-92af-eb2a31d53da6 req-bee1779f-8769-4381-b4bd-3f022b1c08ca 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.301 231315 DEBUG nova.compute.manager [req-428d0abb-d689-410a-92af-eb2a31d53da6 req-bee1779f-8769-4381-b4bd-3f022b1c08ca 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Processing event network-vif-plugged-6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.301 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.301 142920 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a53cafa8-a74e-467c-9117-a31bd6c650ae.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a53cafa8-a74e-467c-9117-a31bd6c650ae.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.302 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[9a91d66a-8772-4093-a4b4-e25f479eb3d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.303 142920 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: global
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    log         /dev/log local0 debug
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    log-tag     haproxy-metadata-proxy-a53cafa8-a74e-467c-9117-a31bd6c650ae
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    user        root
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    group       root
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    maxconn     1024
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    pidfile     /var/lib/neutron/external/pids/a53cafa8-a74e-467c-9117-a31bd6c650ae.pid.haproxy
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    daemon
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: defaults
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    log global
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    mode http
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    option httplog
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    option dontlognull
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    option http-server-close
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    option forwardfor
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    retries                 3
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    timeout http-request    30s
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    timeout connect         30s
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    timeout client          32s
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    timeout server          32s
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    timeout http-keep-alive 30s
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: listen listener
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    bind 169.254.169.254:80
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    server metadata /var/lib/neutron/metadata_proxy
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]:    http-request add-header X-OVN-Network-ID a53cafa8-a74e-467c-9117-a31bd6c650ae
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 23 16:11:59 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:11:59.304 142920 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae', 'env', 'PROCESS_TAG=haproxy-a53cafa8-a74e-467c-9117-a31bd6c650ae', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a53cafa8-a74e-467c-9117-a31bd6c650ae.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.410 231315 DEBUG nova.virt.driver [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Emitting event <LifecycleEvent: 1763932319.4102008, 06b47618-b4a9-4de4-94aa-b97241ff094c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.411 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] VM Started (Lifecycle Event)#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.414 231315 DEBUG nova.compute.manager [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.420 231315 DEBUG nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.424 231315 INFO nova.virt.libvirt.driver [-] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Instance spawned successfully.#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.424 231315 DEBUG nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.428 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.438 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.452 231315 DEBUG nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.453 231315 DEBUG nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.454 231315 DEBUG nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.454 231315 DEBUG nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.456 231315 DEBUG nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.457 231315 DEBUG nova.virt.libvirt.driver [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.463 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.464 231315 DEBUG nova.virt.driver [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Emitting event <LifecycleEvent: 1763932319.4103966, 06b47618-b4a9-4de4-94aa-b97241ff094c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.465 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] VM Paused (Lifecycle Event)#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.491 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.494 231315 DEBUG nova.virt.driver [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Emitting event <LifecycleEvent: 1763932319.4176757, 06b47618-b4a9-4de4-94aa-b97241ff094c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.495 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] VM Resumed (Lifecycle Event)#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.522 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.526 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.531 231315 INFO nova.compute.manager [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Took 6.86 seconds to spawn the instance on the hypervisor.#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.532 231315 DEBUG nova.compute.manager [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.543 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.597 231315 INFO nova.compute.manager [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Took 7.79 seconds to build instance.#033[00m
Nov 23 16:11:59 np0005532763 nova_compute[231311]: 2025-11-23 21:11:59.609 231315 DEBUG oslo_concurrency.lockutils [None req-cdc7dc60-8ace-488d-9c4c-775a97580f0a 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.869s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:11:59 np0005532763 podman[240431]: 2025-11-23 21:11:59.758674603 +0000 UTC m=+0.089000899 container create 45d9ea17230c20f5f8ea4e3db155da147a7efcd643378fa7c684400258a8f804 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 23 16:11:59 np0005532763 podman[240431]: 2025-11-23 21:11:59.708489755 +0000 UTC m=+0.038816091 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 23 16:11:59 np0005532763 systemd[1]: Started libpod-conmon-45d9ea17230c20f5f8ea4e3db155da147a7efcd643378fa7c684400258a8f804.scope.
Nov 23 16:11:59 np0005532763 systemd[1]: Started libcrun container.
Nov 23 16:11:59 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/18ec784255dcd505044fd242be7ae66fc3d81ffffee187b36f235bfc4bea4f39/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 16:11:59 np0005532763 podman[240431]: 2025-11-23 21:11:59.85803144 +0000 UTC m=+0.188357756 container init 45d9ea17230c20f5f8ea4e3db155da147a7efcd643378fa7c684400258a8f804 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Nov 23 16:11:59 np0005532763 podman[240431]: 2025-11-23 21:11:59.86772904 +0000 UTC m=+0.198055336 container start 45d9ea17230c20f5f8ea4e3db155da147a7efcd643378fa7c684400258a8f804 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 16:11:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:11:59 np0005532763 neutron-haproxy-ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae[240447]: [NOTICE]   (240451) : New worker (240453) forked
Nov 23 16:11:59 np0005532763 neutron-haproxy-ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae[240447]: [NOTICE]   (240451) : Loading success.
Nov 23 16:11:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:11:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:00.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:01.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:01 np0005532763 nova_compute[231311]: 2025-11-23 21:12:01.399 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:12:01 np0005532763 nova_compute[231311]: 2025-11-23 21:12:01.413 231315 DEBUG nova.compute.manager [req-12640cef-48f0-4716-9d73-a9b3c7559ac8 req-70351f1a-1cee-4ce5-8e8e-d35f27c2bfbd 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Received event network-vif-plugged-6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:12:01 np0005532763 nova_compute[231311]: 2025-11-23 21:12:01.414 231315 DEBUG oslo_concurrency.lockutils [req-12640cef-48f0-4716-9d73-a9b3c7559ac8 req-70351f1a-1cee-4ce5-8e8e-d35f27c2bfbd 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:12:01 np0005532763 nova_compute[231311]: 2025-11-23 21:12:01.415 231315 DEBUG oslo_concurrency.lockutils [req-12640cef-48f0-4716-9d73-a9b3c7559ac8 req-70351f1a-1cee-4ce5-8e8e-d35f27c2bfbd 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:12:01 np0005532763 nova_compute[231311]: 2025-11-23 21:12:01.419 231315 DEBUG oslo_concurrency.lockutils [req-12640cef-48f0-4716-9d73-a9b3c7559ac8 req-70351f1a-1cee-4ce5-8e8e-d35f27c2bfbd 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:12:01 np0005532763 nova_compute[231311]: 2025-11-23 21:12:01.420 231315 DEBUG nova.compute.manager [req-12640cef-48f0-4716-9d73-a9b3c7559ac8 req-70351f1a-1cee-4ce5-8e8e-d35f27c2bfbd 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] No waiting events found dispatching network-vif-plugged-6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:12:01 np0005532763 nova_compute[231311]: 2025-11-23 21:12:01.420 231315 WARNING nova.compute.manager [req-12640cef-48f0-4716-9d73-a9b3c7559ac8 req-70351f1a-1cee-4ce5-8e8e-d35f27c2bfbd 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Received unexpected event network-vif-plugged-6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 for instance with vm_state active and task_state None.#033[00m
Nov 23 16:12:01 np0005532763 nova_compute[231311]: 2025-11-23 21:12:01.775 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000056s ======
Nov 23 16:12:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:02.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Nov 23 16:12:02 np0005532763 nova_compute[231311]: 2025-11-23 21:12:02.675 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:03.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:03 np0005532763 nova_compute[231311]: 2025-11-23 21:12:03.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:12:03 np0005532763 nova_compute[231311]: 2025-11-23 21:12:03.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 23 16:12:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:03 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:12:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:03 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:12:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:03 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:12:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:04 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:12:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:04.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:04 np0005532763 nova_compute[231311]: 2025-11-23 21:12:04.394 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:12:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:12:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:05.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:05 np0005532763 nova_compute[231311]: 2025-11-23 21:12:05.379 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:12:05 np0005532763 nova_compute[231311]: 2025-11-23 21:12:05.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:12:05 np0005532763 nova_compute[231311]: 2025-11-23 21:12:05.382 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:12:05 np0005532763 nova_compute[231311]: 2025-11-23 21:12:05.382 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:12:05 np0005532763 nova_compute[231311]: 2025-11-23 21:12:05.609 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "refresh_cache-06b47618-b4a9-4de4-94aa-b97241ff094c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:12:05 np0005532763 nova_compute[231311]: 2025-11-23 21:12:05.610 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquired lock "refresh_cache-06b47618-b4a9-4de4-94aa-b97241ff094c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:12:05 np0005532763 nova_compute[231311]: 2025-11-23 21:12:05.611 231315 DEBUG nova.network.neutron [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 23 16:12:05 np0005532763 nova_compute[231311]: 2025-11-23 21:12:05.611 231315 DEBUG nova.objects.instance [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 06b47618-b4a9-4de4-94aa-b97241ff094c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:12:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:06.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:06 np0005532763 nova_compute[231311]: 2025-11-23 21:12:06.778 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.055 231315 DEBUG nova.network.neutron [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Updating instance_info_cache with network_info: [{"id": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "address": "fa:16:3e:f8:3f:80", "network": {"id": "a53cafa8-a74e-467c-9117-a31bd6c650ae", "bridge": "br-int", "label": "tempest-network-smoke--511994107", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6840e90e-b1", "ovs_interfaceid": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.076 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Releasing lock "refresh_cache-06b47618-b4a9-4de4-94aa-b97241ff094c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.077 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.078 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.078 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.079 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.079 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.094 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 23 16:12:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:07.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.398 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.416 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.416 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.417 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.417 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.418 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.677 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:12:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4168597009' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:12:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:12:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4168597009' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:12:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:12:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/497318453' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.924 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:12:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.987 231315 DEBUG nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] skipping disk for instance-00000007 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Nov 23 16:12:07 np0005532763 nova_compute[231311]: 2025-11-23 21:12:07.988 231315 DEBUG nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] skipping disk for instance-00000007 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Nov 23 16:12:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:08 np0005532763 nova_compute[231311]: 2025-11-23 21:12:08.231 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:12:08 np0005532763 nova_compute[231311]: 2025-11-23 21:12:08.232 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4701MB free_disk=59.92176818847656GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:12:08 np0005532763 nova_compute[231311]: 2025-11-23 21:12:08.233 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:12:08 np0005532763 nova_compute[231311]: 2025-11-23 21:12:08.234 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:12:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:08.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:08 np0005532763 nova_compute[231311]: 2025-11-23 21:12:08.357 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Instance 06b47618-b4a9-4de4-94aa-b97241ff094c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 23 16:12:08 np0005532763 nova_compute[231311]: 2025-11-23 21:12:08.357 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:12:08 np0005532763 nova_compute[231311]: 2025-11-23 21:12:08.358 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:12:08 np0005532763 nova_compute[231311]: 2025-11-23 21:12:08.506 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:12:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:08 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:12:08 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1749248425' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:12:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:12:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:12:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:08 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:12:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:09 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:12:09 np0005532763 nova_compute[231311]: 2025-11-23 21:12:09.007 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:12:09 np0005532763 nova_compute[231311]: 2025-11-23 21:12:09.015 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:12:09 np0005532763 nova_compute[231311]: 2025-11-23 21:12:09.047 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:12:09 np0005532763 nova_compute[231311]: 2025-11-23 21:12:09.067 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:12:09 np0005532763 nova_compute[231311]: 2025-11-23 21:12:09.067 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.834s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:12:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:09.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:12:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:10 np0005532763 nova_compute[231311]: 2025-11-23 21:12:10.053 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:12:10 np0005532763 nova_compute[231311]: 2025-11-23 21:12:10.053 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:12:10 np0005532763 nova_compute[231311]: 2025-11-23 21:12:10.053 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:12:10 np0005532763 nova_compute[231311]: 2025-11-23 21:12:10.054 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:12:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:10.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:11.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:11 np0005532763 nova_compute[231311]: 2025-11-23 21:12:11.814 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:12 np0005532763 ovn_controller[133425]: 2025-11-23T21:12:12Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f8:3f:80 10.100.0.25
Nov 23 16:12:12 np0005532763 ovn_controller[133425]: 2025-11-23T21:12:12Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f8:3f:80 10.100.0.25
Nov 23 16:12:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:12.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:12 np0005532763 nova_compute[231311]: 2025-11-23 21:12:12.680 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:13.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:13 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:12:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:13 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:12:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:13 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:12:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:14 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:12:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:14.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:12:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:15.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:16.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:16 np0005532763 nova_compute[231311]: 2025-11-23 21:12:16.866 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:17.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:17 np0005532763 nova_compute[231311]: 2025-11-23 21:12:17.683 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:18 np0005532763 podman[240550]: 2025-11-23 21:12:18.224591579 +0000 UTC m=+0.089576795 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 23 16:12:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:18.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:18 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:12:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:18 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:12:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:18 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:12:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:19 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:12:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:19.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:12:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:12:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:20.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:12:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:21.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:21 np0005532763 nova_compute[231311]: 2025-11-23 21:12:21.900 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000056s ======
Nov 23 16:12:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:22.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Nov 23 16:12:22 np0005532763 nova_compute[231311]: 2025-11-23 21:12:22.686 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:23.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:23 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:12:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:24 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:12:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:24 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:12:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:24 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:12:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:24.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:12:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:25.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:26.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:26 np0005532763 nova_compute[231311]: 2025-11-23 21:12:26.937 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:27.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:27 np0005532763 podman[240606]: 2025-11-23 21:12:27.263641798 +0000 UTC m=+0.137278923 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 16:12:27 np0005532763 nova_compute[231311]: 2025-11-23 21:12:27.689 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:28.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:28 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:12:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:29 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:12:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:29 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:12:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:29 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:12:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:29.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:29 np0005532763 podman[240634]: 2025-11-23 21:12:29.217695609 +0000 UTC m=+0.094179884 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2)
Nov 23 16:12:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:12:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:30.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:31.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:32 np0005532763 nova_compute[231311]: 2025-11-23 21:12:32.003 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:32.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:32 np0005532763 nova_compute[231311]: 2025-11-23 21:12:32.692 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [WARNING] 326/211232 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Nov 23 16:12:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem[85726]: [ALERT] 326/211232 (4) : backend 'backend' has no server available!
Nov 23 16:12:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:33.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:33 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:12:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:33 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:12:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:33 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:12:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:34 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:12:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:34.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:12:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:35.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:36.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:37 np0005532763 nova_compute[231311]: 2025-11-23 21:12:37.036 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:37.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:37 np0005532763 nova_compute[231311]: 2025-11-23 21:12:37.694 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:12:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:38.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:12:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:38 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:12:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:38 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:12:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:38 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:12:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:39 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:12:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:39 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:12:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:39 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:12:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:39 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:12:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:12:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:39.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:12:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:39 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:12:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:12:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:40.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:41.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:42 np0005532763 nova_compute[231311]: 2025-11-23 21:12:42.074 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:12:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:12:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:12:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:12:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:42.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:42 np0005532763 nova_compute[231311]: 2025-11-23 21:12:42.696 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:43.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:44.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:12:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:45.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:45 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Nov 23 16:12:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:45 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Nov 23 16:12:45 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Nov 23 16:12:45 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Nov 23 16:12:45 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Nov 23 16:12:45 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Nov 23 16:12:45 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Nov 23 16:12:45 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000015 to be held by another RGW process; skipping for now
Nov 23 16:12:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:46.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:12:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:12:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:12:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:12:47 np0005532763 nova_compute[231311]: 2025-11-23 21:12:47.107 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:47.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:47 np0005532763 nova_compute[231311]: 2025-11-23 21:12:47.698 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:48 np0005532763 nova_compute[231311]: 2025-11-23 21:12:48.279 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:12:48 np0005532763 nova_compute[231311]: 2025-11-23 21:12:48.298 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Triggering sync for uuid 06b47618-b4a9-4de4-94aa-b97241ff094c _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 23 16:12:48 np0005532763 nova_compute[231311]: 2025-11-23 21:12:48.298 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "06b47618-b4a9-4de4-94aa-b97241ff094c" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:12:48 np0005532763 nova_compute[231311]: 2025-11-23 21:12:48.299 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:12:48 np0005532763 nova_compute[231311]: 2025-11-23 21:12:48.317 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:12:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:48.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:49.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:49 np0005532763 podman[240699]: 2025-11-23 21:12:49.216194543 +0000 UTC m=+0.090278544 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 23 16:12:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:12:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:12:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:50.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:12:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:51.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:51 np0005532763 ovn_controller[133425]: 2025-11-23T21:12:51Z|00053|memory_trim|INFO|Detected inactivity (last active 30015 ms ago): trimming memory
Nov 23 16:12:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:12:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:12:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:12:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:12:52 np0005532763 nova_compute[231311]: 2025-11-23 21:12:52.110 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:12:52.226 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:12:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:12:52.227 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:12:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:12:52.228 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:12:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:52.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:52 np0005532763 nova_compute[231311]: 2025-11-23 21:12:52.701 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:53.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:54.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:54 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:12:54 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:12:54 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:12:54 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:12:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:12:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:55.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:56.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:12:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:12:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:12:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:12:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:12:57 np0005532763 nova_compute[231311]: 2025-11-23 21:12:57.143 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:57.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:57 np0005532763 nova_compute[231311]: 2025-11-23 21:12:57.704 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:12:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:58 np0005532763 podman[240808]: 2025-11-23 21:12:58.307871879 +0000 UTC m=+0.179637174 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 16:12:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:12:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:12:58.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:12:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:12:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:12:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:12:59.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:12:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:12:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:12:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:12:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:12:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:00 np0005532763 podman[240836]: 2025-11-23 21:13:00.232990655 +0000 UTC m=+0.106260260 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 23 16:13:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:00.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:00 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:13:00 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:13:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:01.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:13:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:13:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:13:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:13:02 np0005532763 nova_compute[231311]: 2025-11-23 21:13:02.146 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:02.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:02 np0005532763 nova_compute[231311]: 2025-11-23 21:13:02.747 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:03.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:04.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:04 np0005532763 nova_compute[231311]: 2025-11-23 21:13:04.407 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:13:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:13:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:05.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:05 np0005532763 nova_compute[231311]: 2025-11-23 21:13:05.387 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:13:05 np0005532763 nova_compute[231311]: 2025-11-23 21:13:05.387 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:13:05 np0005532763 nova_compute[231311]: 2025-11-23 21:13:05.387 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:13:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:05 np0005532763 nova_compute[231311]: 2025-11-23 21:13:05.987 231315 DEBUG oslo_concurrency.lockutils [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "06b47618-b4a9-4de4-94aa-b97241ff094c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:13:05 np0005532763 nova_compute[231311]: 2025-11-23 21:13:05.988 231315 DEBUG oslo_concurrency.lockutils [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:13:05 np0005532763 nova_compute[231311]: 2025-11-23 21:13:05.988 231315 DEBUG oslo_concurrency.lockutils [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:13:05 np0005532763 nova_compute[231311]: 2025-11-23 21:13:05.988 231315 DEBUG oslo_concurrency.lockutils [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:13:05 np0005532763 nova_compute[231311]: 2025-11-23 21:13:05.989 231315 DEBUG oslo_concurrency.lockutils [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:13:05 np0005532763 nova_compute[231311]: 2025-11-23 21:13:05.991 231315 INFO nova.compute.manager [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Terminating instance#033[00m
Nov 23 16:13:05 np0005532763 nova_compute[231311]: 2025-11-23 21:13:05.992 231315 DEBUG nova.compute.manager [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:05.999 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "refresh_cache-06b47618-b4a9-4de4-94aa-b97241ff094c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.000 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquired lock "refresh_cache-06b47618-b4a9-4de4-94aa-b97241ff094c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.001 231315 DEBUG nova.network.neutron [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.001 231315 DEBUG nova.objects.instance [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 06b47618-b4a9-4de4-94aa-b97241ff094c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:13:06 np0005532763 kernel: tap6840e90e-b1 (unregistering): left promiscuous mode
Nov 23 16:13:06 np0005532763 NetworkManager[48849]: <info>  [1763932386.0573] device (tap6840e90e-b1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 23 16:13:06 np0005532763 ovn_controller[133425]: 2025-11-23T21:13:06Z|00054|binding|INFO|Releasing lport 6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 from this chassis (sb_readonly=0)
Nov 23 16:13:06 np0005532763 ovn_controller[133425]: 2025-11-23T21:13:06Z|00055|binding|INFO|Setting lport 6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 down in Southbound
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.069 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:06 np0005532763 ovn_controller[133425]: 2025-11-23T21:13:06Z|00056|binding|INFO|Removing iface tap6840e90e-b1 ovn-installed in OVS
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.072 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:06.080 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:3f:80 10.100.0.25'], port_security=['fa:16:3e:f8:3f:80 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '06b47618-b4a9-4de4-94aa-b97241ff094c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a53cafa8-a74e-467c-9117-a31bd6c650ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782593db60784ab8bff41fe87d72ff5f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '10befa5b-28b9-4956-82b3-968d8cb6ea4e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c22c132b-3565-4344-9558-f1d93c19cb57, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], logical_port=6840e90e-b1cd-46fd-b6c2-5a573ed8dde5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:13:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:06.082 142920 INFO neutron.agent.ovn.metadata.agent [-] Port 6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 in datapath a53cafa8-a74e-467c-9117-a31bd6c650ae unbound from our chassis#033[00m
Nov 23 16:13:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:06.084 142920 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a53cafa8-a74e-467c-9117-a31bd6c650ae, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 23 16:13:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:06.086 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[9ef11218-1605-444d-beb7-c4c886d3c4d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:06.087 142920 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae namespace which is not needed anymore#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.110 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:06 np0005532763 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Deactivated successfully.
Nov 23 16:13:06 np0005532763 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Consumed 15.943s CPU time.
Nov 23 16:13:06 np0005532763 systemd-machined[194484]: Machine qemu-4-instance-00000007 terminated.
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.227 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.233 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.238 231315 INFO nova.virt.libvirt.driver [-] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Instance destroyed successfully.#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.239 231315 DEBUG nova.objects.instance [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lazy-loading 'resources' on Instance uuid 06b47618-b4a9-4de4-94aa-b97241ff094c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.253 231315 DEBUG nova.virt.libvirt.vif [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-23T21:11:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1151402468',display_name='tempest-TestNetworkBasicOps-server-1151402468',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1151402468',id=7,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOd9V/Qc+jZYb7RloX+DMFZ2y2px5S2r592+OMqtBAZfMl2Em9uz+jMlWKxAJG012CbASA7fOCn/CjEGt52Mes1DFgmLKPdgn3aP8AXdHCiAl9UyywwteZEGBeiwfxRB2Q==',key_name='tempest-TestNetworkBasicOps-166388418',keypairs=<?>,launch_index=0,launched_at=2025-11-23T21:11:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-axnp2ql9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-23T21:11:59Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=06b47618-b4a9-4de4-94aa-b97241ff094c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "address": "fa:16:3e:f8:3f:80", "network": {"id": "a53cafa8-a74e-467c-9117-a31bd6c650ae", "bridge": "br-int", "label": "tempest-network-smoke--511994107", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6840e90e-b1", "ovs_interfaceid": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.254 231315 DEBUG nova.network.os_vif_util [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "address": "fa:16:3e:f8:3f:80", "network": {"id": "a53cafa8-a74e-467c-9117-a31bd6c650ae", "bridge": "br-int", "label": "tempest-network-smoke--511994107", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6840e90e-b1", "ovs_interfaceid": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.255 231315 DEBUG nova.network.os_vif_util [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f8:3f:80,bridge_name='br-int',has_traffic_filtering=True,id=6840e90e-b1cd-46fd-b6c2-5a573ed8dde5,network=Network(a53cafa8-a74e-467c-9117-a31bd6c650ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6840e90e-b1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.256 231315 DEBUG os_vif [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:3f:80,bridge_name='br-int',has_traffic_filtering=True,id=6840e90e-b1cd-46fd-b6c2-5a573ed8dde5,network=Network(a53cafa8-a74e-467c-9117-a31bd6c650ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6840e90e-b1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.260 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.260 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6840e90e-b1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.262 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.264 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.268 231315 INFO os_vif [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:3f:80,bridge_name='br-int',has_traffic_filtering=True,id=6840e90e-b1cd-46fd-b6c2-5a573ed8dde5,network=Network(a53cafa8-a74e-467c-9117-a31bd6c650ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6840e90e-b1')#033[00m
Nov 23 16:13:06 np0005532763 neutron-haproxy-ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae[240447]: [NOTICE]   (240451) : haproxy version is 2.8.14-c23fe91
Nov 23 16:13:06 np0005532763 neutron-haproxy-ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae[240447]: [NOTICE]   (240451) : path to executable is /usr/sbin/haproxy
Nov 23 16:13:06 np0005532763 neutron-haproxy-ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae[240447]: [WARNING]  (240451) : Exiting Master process...
Nov 23 16:13:06 np0005532763 neutron-haproxy-ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae[240447]: [ALERT]    (240451) : Current worker (240453) exited with code 143 (Terminated)
Nov 23 16:13:06 np0005532763 neutron-haproxy-ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae[240447]: [WARNING]  (240451) : All workers exited. Exiting... (0)
Nov 23 16:13:06 np0005532763 systemd[1]: libpod-45d9ea17230c20f5f8ea4e3db155da147a7efcd643378fa7c684400258a8f804.scope: Deactivated successfully.
Nov 23 16:13:06 np0005532763 podman[240939]: 2025-11-23 21:13:06.302812845 +0000 UTC m=+0.074880016 container died 45d9ea17230c20f5f8ea4e3db155da147a7efcd643378fa7c684400258a8f804 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 16:13:06 np0005532763 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-45d9ea17230c20f5f8ea4e3db155da147a7efcd643378fa7c684400258a8f804-userdata-shm.mount: Deactivated successfully.
Nov 23 16:13:06 np0005532763 systemd[1]: var-lib-containers-storage-overlay-18ec784255dcd505044fd242be7ae66fc3d81ffffee187b36f235bfc4bea4f39-merged.mount: Deactivated successfully.
Nov 23 16:13:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:06.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:06 np0005532763 podman[240939]: 2025-11-23 21:13:06.35938051 +0000 UTC m=+0.131447711 container cleanup 45d9ea17230c20f5f8ea4e3db155da147a7efcd643378fa7c684400258a8f804 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.377 231315 DEBUG nova.compute.manager [req-eecf0016-2f83-46e8-9077-c3f6393c3c3a req-a238497f-3b21-4e4d-a589-bbe623abc6c2 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Received event network-vif-unplugged-6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.377 231315 DEBUG oslo_concurrency.lockutils [req-eecf0016-2f83-46e8-9077-c3f6393c3c3a req-a238497f-3b21-4e4d-a589-bbe623abc6c2 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.378 231315 DEBUG oslo_concurrency.lockutils [req-eecf0016-2f83-46e8-9077-c3f6393c3c3a req-a238497f-3b21-4e4d-a589-bbe623abc6c2 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.378 231315 DEBUG oslo_concurrency.lockutils [req-eecf0016-2f83-46e8-9077-c3f6393c3c3a req-a238497f-3b21-4e4d-a589-bbe623abc6c2 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.379 231315 DEBUG nova.compute.manager [req-eecf0016-2f83-46e8-9077-c3f6393c3c3a req-a238497f-3b21-4e4d-a589-bbe623abc6c2 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] No waiting events found dispatching network-vif-unplugged-6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.380 231315 DEBUG nova.compute.manager [req-eecf0016-2f83-46e8-9077-c3f6393c3c3a req-a238497f-3b21-4e4d-a589-bbe623abc6c2 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Received event network-vif-unplugged-6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 23 16:13:06 np0005532763 systemd[1]: libpod-conmon-45d9ea17230c20f5f8ea4e3db155da147a7efcd643378fa7c684400258a8f804.scope: Deactivated successfully.
Nov 23 16:13:06 np0005532763 podman[240996]: 2025-11-23 21:13:06.459908929 +0000 UTC m=+0.057582894 container remove 45d9ea17230c20f5f8ea4e3db155da147a7efcd643378fa7c684400258a8f804 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae, org.label-schema.build-date=20251118, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 16:13:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:06.470 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[1f921cd3-d497-43ac-af38-f4bb023e1992]: (4, ('Sun Nov 23 09:13:06 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae (45d9ea17230c20f5f8ea4e3db155da147a7efcd643378fa7c684400258a8f804)\n45d9ea17230c20f5f8ea4e3db155da147a7efcd643378fa7c684400258a8f804\nSun Nov 23 09:13:06 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae (45d9ea17230c20f5f8ea4e3db155da147a7efcd643378fa7c684400258a8f804)\n45d9ea17230c20f5f8ea4e3db155da147a7efcd643378fa7c684400258a8f804\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:06.473 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[c59cc92e-d057-4666-826d-e9b5bee63ce5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:06.474 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa53cafa8-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.477 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:06 np0005532763 kernel: tapa53cafa8-a0: left promiscuous mode
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.481 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:06.485 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[e4a25e7a-83b9-4606-bd4c-37b7c7996b73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:06.490 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3a:26:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:d5:4d:db:d5:2b'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.504 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:06.512 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[c0641679-6274-4ab0-a79b-cdae908d6566]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:06.514 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[e06655b1-1e1f-4559-85cc-b3f9621f5360]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:06.536 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[d10abe4b-fa78-4702-a7f9-94f715f6f145]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 426854, 'reachable_time': 23608, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241011, 'error': None, 'target': 'ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:06.541 143034 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a53cafa8-a74e-467c-9117-a31bd6c650ae deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 23 16:13:06 np0005532763 systemd[1]: run-netns-ovnmeta\x2da53cafa8\x2da74e\x2d467c\x2d9117\x2da31bd6c650ae.mount: Deactivated successfully.
Nov 23 16:13:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:06.542 143034 DEBUG oslo.privsep.daemon [-] privsep: reply[1ea6d231-d19f-4062-8add-6d8e6e1abad7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:06.545 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.752 231315 INFO nova.virt.libvirt.driver [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Deleting instance files /var/lib/nova/instances/06b47618-b4a9-4de4-94aa-b97241ff094c_del#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.754 231315 INFO nova.virt.libvirt.driver [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Deletion of /var/lib/nova/instances/06b47618-b4a9-4de4-94aa-b97241ff094c_del complete#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.806 231315 INFO nova.compute.manager [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Took 0.81 seconds to destroy the instance on the hypervisor.#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.806 231315 DEBUG oslo.service.loopingcall [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.807 231315 DEBUG nova.compute.manager [-] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 23 16:13:06 np0005532763 nova_compute[231311]: 2025-11-23 21:13:06.807 231315 DEBUG nova.network.neutron [-] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 23 16:13:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:13:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:13:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:13:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:13:07 np0005532763 nova_compute[231311]: 2025-11-23 21:13:07.198 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:13:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:07.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:13:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:07 np0005532763 nova_compute[231311]: 2025-11-23 21:13:07.577 231315 DEBUG nova.network.neutron [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Updating instance_info_cache with network_info: [{"id": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "address": "fa:16:3e:f8:3f:80", "network": {"id": "a53cafa8-a74e-467c-9117-a31bd6c650ae", "bridge": "br-int", "label": "tempest-network-smoke--511994107", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6840e90e-b1", "ovs_interfaceid": "6840e90e-b1cd-46fd-b6c2-5a573ed8dde5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:13:07 np0005532763 nova_compute[231311]: 2025-11-23 21:13:07.591 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Releasing lock "refresh_cache-06b47618-b4a9-4de4-94aa-b97241ff094c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:13:07 np0005532763 nova_compute[231311]: 2025-11-23 21:13:07.592 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 23 16:13:07 np0005532763 nova_compute[231311]: 2025-11-23 21:13:07.593 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:13:07 np0005532763 nova_compute[231311]: 2025-11-23 21:13:07.593 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:13:07 np0005532763 nova_compute[231311]: 2025-11-23 21:13:07.760 231315 DEBUG nova.network.neutron [-] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:13:07 np0005532763 nova_compute[231311]: 2025-11-23 21:13:07.775 231315 INFO nova.compute.manager [-] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Took 0.97 seconds to deallocate network for instance.#033[00m
Nov 23 16:13:07 np0005532763 nova_compute[231311]: 2025-11-23 21:13:07.817 231315 DEBUG oslo_concurrency.lockutils [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:13:07 np0005532763 nova_compute[231311]: 2025-11-23 21:13:07.818 231315 DEBUG oslo_concurrency.lockutils [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:13:07 np0005532763 nova_compute[231311]: 2025-11-23 21:13:07.850 231315 DEBUG nova.scheduler.client.report [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Refreshing inventories for resource provider 20c32e0a-de2c-427c-9273-fac11e2660f4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 23 16:13:07 np0005532763 nova_compute[231311]: 2025-11-23 21:13:07.885 231315 DEBUG nova.scheduler.client.report [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Updating ProviderTree inventory for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 23 16:13:07 np0005532763 nova_compute[231311]: 2025-11-23 21:13:07.885 231315 DEBUG nova.compute.provider_tree [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Updating inventory in ProviderTree for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 23 16:13:07 np0005532763 nova_compute[231311]: 2025-11-23 21:13:07.906 231315 DEBUG nova.scheduler.client.report [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Refreshing aggregate associations for resource provider 20c32e0a-de2c-427c-9273-fac11e2660f4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 23 16:13:07 np0005532763 nova_compute[231311]: 2025-11-23 21:13:07.931 231315 DEBUG nova.scheduler.client.report [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Refreshing trait associations for resource provider 20c32e0a-de2c-427c-9273-fac11e2660f4, traits: COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AVX2,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_FMA3,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE,HW_CPU_X86_SVM,HW_CPU_X86_BMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 23 16:13:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:07 np0005532763 nova_compute[231311]: 2025-11-23 21:13:07.965 231315 DEBUG oslo_concurrency.processutils [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:13:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:08.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.403 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:13:08 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:13:08 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/476045377' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.493 231315 DEBUG oslo_concurrency.processutils [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.499 231315 DEBUG nova.compute.provider_tree [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.520 231315 DEBUG nova.scheduler.client.report [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.529 231315 DEBUG nova.compute.manager [req-b5bde450-8871-479b-9894-a8b43943998c req-3290633f-1d6d-46a9-bd84-1ce15d6d61eb 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Received event network-vif-plugged-6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.529 231315 DEBUG oslo_concurrency.lockutils [req-b5bde450-8871-479b-9894-a8b43943998c req-3290633f-1d6d-46a9-bd84-1ce15d6d61eb 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.530 231315 DEBUG oslo_concurrency.lockutils [req-b5bde450-8871-479b-9894-a8b43943998c req-3290633f-1d6d-46a9-bd84-1ce15d6d61eb 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.530 231315 DEBUG oslo_concurrency.lockutils [req-b5bde450-8871-479b-9894-a8b43943998c req-3290633f-1d6d-46a9-bd84-1ce15d6d61eb 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.531 231315 DEBUG nova.compute.manager [req-b5bde450-8871-479b-9894-a8b43943998c req-3290633f-1d6d-46a9-bd84-1ce15d6d61eb 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] No waiting events found dispatching network-vif-plugged-6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.531 231315 WARNING nova.compute.manager [req-b5bde450-8871-479b-9894-a8b43943998c req-3290633f-1d6d-46a9-bd84-1ce15d6d61eb 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Received unexpected event network-vif-plugged-6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 for instance with vm_state deleted and task_state None.#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.532 231315 DEBUG nova.compute.manager [req-b5bde450-8871-479b-9894-a8b43943998c req-3290633f-1d6d-46a9-bd84-1ce15d6d61eb 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Received event network-vif-deleted-6840e90e-b1cd-46fd-b6c2-5a573ed8dde5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.546 231315 DEBUG oslo_concurrency.lockutils [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.728s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:13:08 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:08.547 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10e3bf57-dd2d-4b94-851f-925bcd297dde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.550 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.551 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.551 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.552 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.612 231315 INFO nova.scheduler.client.report [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Deleted allocations for instance 06b47618-b4a9-4de4-94aa-b97241ff094c#033[00m
Nov 23 16:13:08 np0005532763 nova_compute[231311]: 2025-11-23 21:13:08.676 231315 DEBUG oslo_concurrency.lockutils [None req-145b758d-9f9b-47fb-9f63-b87e8b02bb9d 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "06b47618-b4a9-4de4-94aa-b97241ff094c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.689s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:13:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:13:09 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1740834039' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:13:09 np0005532763 nova_compute[231311]: 2025-11-23 21:13:09.100 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.549s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:13:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:09.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:09 np0005532763 nova_compute[231311]: 2025-11-23 21:13:09.364 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:13:09 np0005532763 nova_compute[231311]: 2025-11-23 21:13:09.366 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4880MB free_disk=59.89692687988281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:13:09 np0005532763 nova_compute[231311]: 2025-11-23 21:13:09.366 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:13:09 np0005532763 nova_compute[231311]: 2025-11-23 21:13:09.367 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:13:09 np0005532763 nova_compute[231311]: 2025-11-23 21:13:09.416 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:13:09 np0005532763 nova_compute[231311]: 2025-11-23 21:13:09.416 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:13:09 np0005532763 nova_compute[231311]: 2025-11-23 21:13:09.432 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:13:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:13:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:13:09 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3558533142' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:13:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:09 np0005532763 nova_compute[231311]: 2025-11-23 21:13:09.964 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.532s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:13:09 np0005532763 nova_compute[231311]: 2025-11-23 21:13:09.974 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:13:09 np0005532763 nova_compute[231311]: 2025-11-23 21:13:09.990 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:13:10 np0005532763 nova_compute[231311]: 2025-11-23 21:13:10.014 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:13:10 np0005532763 nova_compute[231311]: 2025-11-23 21:13:10.014 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.647s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:13:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:10.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:11 np0005532763 nova_compute[231311]: 2025-11-23 21:13:11.015 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:13:11 np0005532763 nova_compute[231311]: 2025-11-23 21:13:11.015 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:13:11 np0005532763 nova_compute[231311]: 2025-11-23 21:13:11.016 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:13:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:11.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:11 np0005532763 nova_compute[231311]: 2025-11-23 21:13:11.264 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:11 np0005532763 nova_compute[231311]: 2025-11-23 21:13:11.842 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:13:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:13:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:13:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:13:12 np0005532763 nova_compute[231311]: 2025-11-23 21:13:12.223 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:12.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:13.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:14.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:13:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:15.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:16 np0005532763 nova_compute[231311]: 2025-11-23 21:13:16.267 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:16.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:13:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:13:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:13:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:13:17 np0005532763 nova_compute[231311]: 2025-11-23 21:13:17.226 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:17.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:18.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:19.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:13:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:20 np0005532763 podman[241094]: 2025-11-23 21:13:20.213499016 +0000 UTC m=+0.089656497 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 16:13:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:20.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:21.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:21 np0005532763 nova_compute[231311]: 2025-11-23 21:13:21.236 231315 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763932386.2353332, 06b47618-b4a9-4de4-94aa-b97241ff094c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:13:21 np0005532763 nova_compute[231311]: 2025-11-23 21:13:21.237 231315 INFO nova.compute.manager [-] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] VM Stopped (Lifecycle Event)#033[00m
Nov 23 16:13:21 np0005532763 nova_compute[231311]: 2025-11-23 21:13:21.270 231315 DEBUG nova.compute.manager [None req-77957cc9-44d3-402a-96cd-448f5370f18b - - - - - -] [instance: 06b47618-b4a9-4de4-94aa-b97241ff094c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:13:21 np0005532763 nova_compute[231311]: 2025-11-23 21:13:21.271 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:13:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:13:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:13:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:13:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:22 np0005532763 nova_compute[231311]: 2025-11-23 21:13:22.285 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:13:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:22.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:13:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:23.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:24.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:13:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:25.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:26 np0005532763 nova_compute[231311]: 2025-11-23 21:13:26.274 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:26.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:13:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:13:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:13:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:13:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:27.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:27 np0005532763 nova_compute[231311]: 2025-11-23 21:13:27.289 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:28.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:29.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:29 np0005532763 podman[241148]: 2025-11-23 21:13:29.264021174 +0000 UTC m=+0.136230444 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 23 16:13:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:13:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:30.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:31 np0005532763 podman[241176]: 2025-11-23 21:13:31.223295244 +0000 UTC m=+0.096734405 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 16:13:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:31.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:31 np0005532763 nova_compute[231311]: 2025-11-23 21:13:31.277 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:13:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:13:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:13:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:13:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:32 np0005532763 nova_compute[231311]: 2025-11-23 21:13:32.331 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:32.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:33.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:34.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:13:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:35.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:36 np0005532763 nova_compute[231311]: 2025-11-23 21:13:36.281 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:36.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:13:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:13:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:13:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:13:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:37.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:37 np0005532763 nova_compute[231311]: 2025-11-23 21:13:37.334 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:38.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:39.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:13:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:40.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:13:40.613792) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932420613863, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1768, "num_deletes": 257, "total_data_size": 4533470, "memory_usage": 4624144, "flush_reason": "Manual Compaction"}
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932420631075, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 2940573, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28573, "largest_seqno": 30336, "table_properties": {"data_size": 2933288, "index_size": 4228, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 15040, "raw_average_key_size": 19, "raw_value_size": 2918711, "raw_average_value_size": 3790, "num_data_blocks": 186, "num_entries": 770, "num_filter_entries": 770, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763932273, "oldest_key_time": 1763932273, "file_creation_time": 1763932420, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 17350 microseconds, and 11706 cpu microseconds.
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:13:40.631142) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 2940573 bytes OK
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:13:40.631175) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:13:40.632920) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:13:40.632945) EVENT_LOG_v1 {"time_micros": 1763932420632937, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:13:40.632975) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 4525481, prev total WAL file size 4525481, number of live WAL files 2.
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:13:40.635004) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353031' seq:72057594037927935, type:22 .. '6C6F676D00373534' seq:0, type:0; will stop at (end)
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(2871KB)], [54(14MB)]
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932420635071, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 17868923, "oldest_snapshot_seqno": -1}
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 6084 keys, 17720326 bytes, temperature: kUnknown
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932420731153, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 17720326, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17675937, "index_size": 28087, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15237, "raw_key_size": 154785, "raw_average_key_size": 25, "raw_value_size": 17562769, "raw_average_value_size": 2886, "num_data_blocks": 1153, "num_entries": 6084, "num_filter_entries": 6084, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763932420, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:13:40.731571) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 17720326 bytes
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:13:40.733309) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 185.7 rd, 184.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.8, 14.2 +0.0 blob) out(16.9 +0.0 blob), read-write-amplify(12.1) write-amplify(6.0) OK, records in: 6616, records dropped: 532 output_compression: NoCompression
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:13:40.733341) EVENT_LOG_v1 {"time_micros": 1763932420733326, "job": 32, "event": "compaction_finished", "compaction_time_micros": 96230, "compaction_time_cpu_micros": 68652, "output_level": 6, "num_output_files": 1, "total_output_size": 17720326, "num_input_records": 6616, "num_output_records": 6084, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932420734644, "job": 32, "event": "table_file_deletion", "file_number": 56}
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932420739857, "job": 32, "event": "table_file_deletion", "file_number": 54}
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:13:40.634873) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:13:40.739909) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:13:40.739916) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:13:40.739919) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:13:40.739922) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:13:40 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:13:40.739925) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:13:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:41.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:41 np0005532763 nova_compute[231311]: 2025-11-23 21:13:41.285 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:13:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:13:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:13:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:13:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:42 np0005532763 nova_compute[231311]: 2025-11-23 21:13:42.356 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:42.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:43.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:44.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:13:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:45.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:46 np0005532763 nova_compute[231311]: 2025-11-23 21:13:46.289 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.003000084s ======
Nov 23 16:13:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:46.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000084s
Nov 23 16:13:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:13:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:13:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:13:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:13:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:47.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:47 np0005532763 nova_compute[231311]: 2025-11-23 21:13:47.402 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:48.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:49.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:13:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:50.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:51 np0005532763 podman[241241]: 2025-11-23 21:13:51.195844289 +0000 UTC m=+0.077267123 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 23 16:13:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:51.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:51 np0005532763 nova_compute[231311]: 2025-11-23 21:13:51.292 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:51 np0005532763 nova_compute[231311]: 2025-11-23 21:13:51.974 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "c5f71bc7-14b7-4aae-992b-71709e979f38" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:13:51 np0005532763 nova_compute[231311]: 2025-11-23 21:13:51.975 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "c5f71bc7-14b7-4aae-992b-71709e979f38" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:13:51 np0005532763 nova_compute[231311]: 2025-11-23 21:13:51.989 231315 DEBUG nova.compute.manager [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 23 16:13:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:13:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:13:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:13:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.061 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.061 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.070 231315 DEBUG nova.virt.hardware [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.071 231315 INFO nova.compute.claims [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.176 231315 DEBUG oslo_concurrency.processutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:13:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:52.227 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:13:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:52.228 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:13:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:52.228 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:13:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.405 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:52.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:52 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:13:52 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3665490940' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.734 231315 DEBUG oslo_concurrency.processutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.558s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.740 231315 DEBUG nova.compute.provider_tree [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.751 231315 DEBUG nova.scheduler.client.report [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.769 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.708s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.770 231315 DEBUG nova.compute.manager [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.810 231315 DEBUG nova.compute.manager [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.811 231315 DEBUG nova.network.neutron [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.826 231315 INFO nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.847 231315 DEBUG nova.compute.manager [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 23 16:13:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.967 231315 DEBUG nova.compute.manager [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.969 231315 DEBUG nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 23 16:13:52 np0005532763 nova_compute[231311]: 2025-11-23 21:13:52.970 231315 INFO nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Creating image(s)#033[00m
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.014 231315 DEBUG nova.storage.rbd_utils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image c5f71bc7-14b7-4aae-992b-71709e979f38_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.057 231315 DEBUG nova.storage.rbd_utils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image c5f71bc7-14b7-4aae-992b-71709e979f38_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.099 231315 DEBUG nova.storage.rbd_utils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image c5f71bc7-14b7-4aae-992b-71709e979f38_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.105 231315 DEBUG oslo_concurrency.processutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.138 231315 DEBUG nova.policy [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9fb5352c62684f2ba3a326a953a10dfe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '782593db60784ab8bff41fe87d72ff5f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.193 231315 DEBUG oslo_concurrency.processutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.194 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.196 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.196 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.237 231315 DEBUG nova.storage.rbd_utils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image c5f71bc7-14b7-4aae-992b-71709e979f38_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:13:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.243 231315 DEBUG oslo_concurrency.processutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 c5f71bc7-14b7-4aae-992b-71709e979f38_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:13:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:53.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.622 231315 DEBUG oslo_concurrency.processutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 c5f71bc7-14b7-4aae-992b-71709e979f38_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.379s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.740 231315 DEBUG nova.storage.rbd_utils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] resizing rbd image c5f71bc7-14b7-4aae-992b-71709e979f38_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.886 231315 DEBUG nova.objects.instance [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lazy-loading 'migration_context' on Instance uuid c5f71bc7-14b7-4aae-992b-71709e979f38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.898 231315 DEBUG nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.898 231315 DEBUG nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Ensure instance console log exists: /var/lib/nova/instances/c5f71bc7-14b7-4aae-992b-71709e979f38/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.898 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.899 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:13:53 np0005532763 nova_compute[231311]: 2025-11-23 21:13:53.899 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:13:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:54 np0005532763 nova_compute[231311]: 2025-11-23 21:13:54.206 231315 DEBUG nova.network.neutron [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Successfully updated port: ba818b19-9f72-4242-b9d9-b1630b5d1f24 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 23 16:13:54 np0005532763 nova_compute[231311]: 2025-11-23 21:13:54.223 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "refresh_cache-c5f71bc7-14b7-4aae-992b-71709e979f38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:13:54 np0005532763 nova_compute[231311]: 2025-11-23 21:13:54.224 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquired lock "refresh_cache-c5f71bc7-14b7-4aae-992b-71709e979f38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:13:54 np0005532763 nova_compute[231311]: 2025-11-23 21:13:54.224 231315 DEBUG nova.network.neutron [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 23 16:13:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:54 np0005532763 nova_compute[231311]: 2025-11-23 21:13:54.350 231315 DEBUG nova.compute.manager [req-a422c8cd-0b8a-4ca5-a61e-899b92b8c907 req-750086d2-81d0-4def-9405-152ab077c2f8 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Received event network-changed-ba818b19-9f72-4242-b9d9-b1630b5d1f24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:13:54 np0005532763 nova_compute[231311]: 2025-11-23 21:13:54.351 231315 DEBUG nova.compute.manager [req-a422c8cd-0b8a-4ca5-a61e-899b92b8c907 req-750086d2-81d0-4def-9405-152ab077c2f8 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Refreshing instance network info cache due to event network-changed-ba818b19-9f72-4242-b9d9-b1630b5d1f24. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 23 16:13:54 np0005532763 nova_compute[231311]: 2025-11-23 21:13:54.351 231315 DEBUG oslo_concurrency.lockutils [req-a422c8cd-0b8a-4ca5-a61e-899b92b8c907 req-750086d2-81d0-4def-9405-152ab077c2f8 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "refresh_cache-c5f71bc7-14b7-4aae-992b-71709e979f38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:13:54 np0005532763 nova_compute[231311]: 2025-11-23 21:13:54.361 231315 DEBUG nova.network.neutron [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 23 16:13:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:54.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:13:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:55.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:55 np0005532763 ovn_controller[133425]: 2025-11-23T21:13:55Z|00057|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.539 231315 DEBUG nova.network.neutron [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Updating instance_info_cache with network_info: [{"id": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "address": "fa:16:3e:0d:e6:fe", "network": {"id": "fd64d126-bc30-4f96-8737-9a4b1cf2fe8a", "bridge": "br-int", "label": "tempest-network-smoke--1300883220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapba818b19-9f", "ovs_interfaceid": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.557 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Releasing lock "refresh_cache-c5f71bc7-14b7-4aae-992b-71709e979f38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.558 231315 DEBUG nova.compute.manager [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Instance network_info: |[{"id": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "address": "fa:16:3e:0d:e6:fe", "network": {"id": "fd64d126-bc30-4f96-8737-9a4b1cf2fe8a", "bridge": "br-int", "label": "tempest-network-smoke--1300883220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapba818b19-9f", "ovs_interfaceid": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.558 231315 DEBUG oslo_concurrency.lockutils [req-a422c8cd-0b8a-4ca5-a61e-899b92b8c907 req-750086d2-81d0-4def-9405-152ab077c2f8 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquired lock "refresh_cache-c5f71bc7-14b7-4aae-992b-71709e979f38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.558 231315 DEBUG nova.network.neutron [req-a422c8cd-0b8a-4ca5-a61e-899b92b8c907 req-750086d2-81d0-4def-9405-152ab077c2f8 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Refreshing network info cache for port ba818b19-9f72-4242-b9d9-b1630b5d1f24 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.562 231315 DEBUG nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Start _get_guest_xml network_info=[{"id": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "address": "fa:16:3e:0d:e6:fe", "network": {"id": "fd64d126-bc30-4f96-8737-9a4b1cf2fe8a", "bridge": "br-int", "label": "tempest-network-smoke--1300883220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapba818b19-9f", "ovs_interfaceid": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-23T21:05:43Z,direct_url=<?>,disk_format='qcow2',id=3c45fa6c-8a99-4359-a34e-d89f4e1e77d0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3f8fb5175f85402ba20cf9c6989d47cf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-23T21:05:47Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'encryption_options': None, 'size': 0, 'image_id': '3c45fa6c-8a99-4359-a34e-d89f4e1e77d0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.568 231315 WARNING nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.574 231315 DEBUG nova.virt.libvirt.host [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.575 231315 DEBUG nova.virt.libvirt.host [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.582 231315 DEBUG nova.virt.libvirt.host [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.582 231315 DEBUG nova.virt.libvirt.host [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.583 231315 DEBUG nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.583 231315 DEBUG nova.virt.hardware [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-23T21:05:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='56044b93-2979-48aa-b67f-c37e1b489306',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-23T21:05:43Z,direct_url=<?>,disk_format='qcow2',id=3c45fa6c-8a99-4359-a34e-d89f4e1e77d0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3f8fb5175f85402ba20cf9c6989d47cf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-23T21:05:47Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.584 231315 DEBUG nova.virt.hardware [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.584 231315 DEBUG nova.virt.hardware [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.584 231315 DEBUG nova.virt.hardware [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.584 231315 DEBUG nova.virt.hardware [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.584 231315 DEBUG nova.virt.hardware [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.585 231315 DEBUG nova.virt.hardware [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.585 231315 DEBUG nova.virt.hardware [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.585 231315 DEBUG nova.virt.hardware [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.585 231315 DEBUG nova.virt.hardware [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.585 231315 DEBUG nova.virt.hardware [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 23 16:13:55 np0005532763 nova_compute[231311]: 2025-11-23 21:13:55.591 231315 DEBUG oslo_concurrency.processutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:13:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:56 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 16:13:56 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1598720131' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.047 231315 DEBUG oslo_concurrency.processutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.086 231315 DEBUG nova.storage.rbd_utils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image c5f71bc7-14b7-4aae-992b-71709e979f38_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.092 231315 DEBUG oslo_concurrency.processutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:13:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.294 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:13:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:56.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:13:56 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 16:13:56 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1065404581' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.578 231315 DEBUG oslo_concurrency.processutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.580 231315 DEBUG nova.virt.libvirt.vif [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T21:13:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-133268455',display_name='tempest-TestNetworkBasicOps-server-133268455',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-133268455',id=9,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN9XcQAIBv32AOnQ/uBrGNwBZurj0yvrQVcMrvkyK3ctTPLKsvruhaMgHV7K2cU5FTazPRilFrIpMab0kt1oWc9vuxyQVHP2UQmIJWRSBUBYUtVpyBjHjDA9PZXQnhSgeQ==',key_name='tempest-TestNetworkBasicOps-486410375',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-qmtdh5xg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-23T21:13:52Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=c5f71bc7-14b7-4aae-992b-71709e979f38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "address": "fa:16:3e:0d:e6:fe", "network": {"id": "fd64d126-bc30-4f96-8737-9a4b1cf2fe8a", "bridge": "br-int", "label": "tempest-network-smoke--1300883220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapba818b19-9f", "ovs_interfaceid": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.580 231315 DEBUG nova.network.os_vif_util [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "address": "fa:16:3e:0d:e6:fe", "network": {"id": "fd64d126-bc30-4f96-8737-9a4b1cf2fe8a", "bridge": "br-int", "label": "tempest-network-smoke--1300883220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapba818b19-9f", "ovs_interfaceid": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.581 231315 DEBUG nova.network.os_vif_util [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:e6:fe,bridge_name='br-int',has_traffic_filtering=True,id=ba818b19-9f72-4242-b9d9-b1630b5d1f24,network=Network(fd64d126-bc30-4f96-8737-9a4b1cf2fe8a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapba818b19-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.582 231315 DEBUG nova.objects.instance [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lazy-loading 'pci_devices' on Instance uuid c5f71bc7-14b7-4aae-992b-71709e979f38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.608 231315 DEBUG nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] End _get_guest_xml xml=<domain type="kvm">
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  <uuid>c5f71bc7-14b7-4aae-992b-71709e979f38</uuid>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  <name>instance-00000009</name>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  <memory>131072</memory>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  <vcpu>1</vcpu>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  <metadata>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <nova:name>tempest-TestNetworkBasicOps-server-133268455</nova:name>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <nova:creationTime>2025-11-23 21:13:55</nova:creationTime>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <nova:flavor name="m1.nano">
Nov 23 16:13:56 np0005532763 nova_compute[231311]:        <nova:memory>128</nova:memory>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:        <nova:disk>1</nova:disk>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:        <nova:swap>0</nova:swap>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:        <nova:ephemeral>0</nova:ephemeral>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:        <nova:vcpus>1</nova:vcpus>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      </nova:flavor>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <nova:owner>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:        <nova:user uuid="9fb5352c62684f2ba3a326a953a10dfe">tempest-TestNetworkBasicOps-1975357669-project-member</nova:user>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:        <nova:project uuid="782593db60784ab8bff41fe87d72ff5f">tempest-TestNetworkBasicOps-1975357669</nova:project>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      </nova:owner>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <nova:root type="image" uuid="3c45fa6c-8a99-4359-a34e-d89f4e1e77d0"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <nova:ports>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:        <nova:port uuid="ba818b19-9f72-4242-b9d9-b1630b5d1f24">
Nov 23 16:13:56 np0005532763 nova_compute[231311]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:        </nova:port>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      </nova:ports>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    </nova:instance>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  </metadata>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  <sysinfo type="smbios">
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <system>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <entry name="manufacturer">RDO</entry>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <entry name="product">OpenStack Compute</entry>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <entry name="serial">c5f71bc7-14b7-4aae-992b-71709e979f38</entry>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <entry name="uuid">c5f71bc7-14b7-4aae-992b-71709e979f38</entry>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <entry name="family">Virtual Machine</entry>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    </system>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  </sysinfo>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  <os>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <boot dev="hd"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <smbios mode="sysinfo"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  </os>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  <features>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <acpi/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <apic/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <vmcoreinfo/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  </features>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  <clock offset="utc">
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <timer name="pit" tickpolicy="delay"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <timer name="hpet" present="no"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  </clock>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  <cpu mode="host-model" match="exact">
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <topology sockets="1" cores="1" threads="1"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  </cpu>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  <devices>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <disk type="network" device="disk">
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <driver type="raw" cache="none"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <source protocol="rbd" name="vms/c5f71bc7-14b7-4aae-992b-71709e979f38_disk">
Nov 23 16:13:56 np0005532763 nova_compute[231311]:        <host name="192.168.122.100" port="6789"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:        <host name="192.168.122.102" port="6789"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:        <host name="192.168.122.101" port="6789"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      </source>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <auth username="openstack">
Nov 23 16:13:56 np0005532763 nova_compute[231311]:        <secret type="ceph" uuid="03808be8-ae4a-5548-82e6-4a294f1bc627"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      </auth>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <target dev="vda" bus="virtio"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    </disk>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <disk type="network" device="cdrom">
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <driver type="raw" cache="none"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <source protocol="rbd" name="vms/c5f71bc7-14b7-4aae-992b-71709e979f38_disk.config">
Nov 23 16:13:56 np0005532763 nova_compute[231311]:        <host name="192.168.122.100" port="6789"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:        <host name="192.168.122.102" port="6789"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:        <host name="192.168.122.101" port="6789"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      </source>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <auth username="openstack">
Nov 23 16:13:56 np0005532763 nova_compute[231311]:        <secret type="ceph" uuid="03808be8-ae4a-5548-82e6-4a294f1bc627"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      </auth>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <target dev="sda" bus="sata"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    </disk>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <interface type="ethernet">
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <mac address="fa:16:3e:0d:e6:fe"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <model type="virtio"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <driver name="vhost" rx_queue_size="512"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <mtu size="1442"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <target dev="tapba818b19-9f"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    </interface>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <serial type="pty">
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <log file="/var/lib/nova/instances/c5f71bc7-14b7-4aae-992b-71709e979f38/console.log" append="off"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    </serial>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <video>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <model type="virtio"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    </video>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <input type="tablet" bus="usb"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <rng model="virtio">
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <backend model="random">/dev/urandom</backend>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    </rng>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <controller type="usb" index="0"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    <memballoon model="virtio">
Nov 23 16:13:56 np0005532763 nova_compute[231311]:      <stats period="10"/>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:    </memballoon>
Nov 23 16:13:56 np0005532763 nova_compute[231311]:  </devices>
Nov 23 16:13:56 np0005532763 nova_compute[231311]: </domain>
Nov 23 16:13:56 np0005532763 nova_compute[231311]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.609 231315 DEBUG nova.compute.manager [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Preparing to wait for external event network-vif-plugged-ba818b19-9f72-4242-b9d9-b1630b5d1f24 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.610 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.611 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.611 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.612 231315 DEBUG nova.virt.libvirt.vif [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T21:13:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-133268455',display_name='tempest-TestNetworkBasicOps-server-133268455',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-133268455',id=9,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN9XcQAIBv32AOnQ/uBrGNwBZurj0yvrQVcMrvkyK3ctTPLKsvruhaMgHV7K2cU5FTazPRilFrIpMab0kt1oWc9vuxyQVHP2UQmIJWRSBUBYUtVpyBjHjDA9PZXQnhSgeQ==',key_name='tempest-TestNetworkBasicOps-486410375',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-qmtdh5xg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-23T21:13:52Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=c5f71bc7-14b7-4aae-992b-71709e979f38,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "address": "fa:16:3e:0d:e6:fe", "network": {"id": "fd64d126-bc30-4f96-8737-9a4b1cf2fe8a", "bridge": "br-int", "label": "tempest-network-smoke--1300883220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapba818b19-9f", "ovs_interfaceid": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.613 231315 DEBUG nova.network.os_vif_util [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "address": "fa:16:3e:0d:e6:fe", "network": {"id": "fd64d126-bc30-4f96-8737-9a4b1cf2fe8a", "bridge": "br-int", "label": "tempest-network-smoke--1300883220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapba818b19-9f", "ovs_interfaceid": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.614 231315 DEBUG nova.network.os_vif_util [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:e6:fe,bridge_name='br-int',has_traffic_filtering=True,id=ba818b19-9f72-4242-b9d9-b1630b5d1f24,network=Network(fd64d126-bc30-4f96-8737-9a4b1cf2fe8a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapba818b19-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.615 231315 DEBUG os_vif [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:e6:fe,bridge_name='br-int',has_traffic_filtering=True,id=ba818b19-9f72-4242-b9d9-b1630b5d1f24,network=Network(fd64d126-bc30-4f96-8737-9a4b1cf2fe8a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapba818b19-9f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.616 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.617 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.618 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.622 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.622 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapba818b19-9f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.623 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapba818b19-9f, col_values=(('external_ids', {'iface-id': 'ba818b19-9f72-4242-b9d9-b1630b5d1f24', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0d:e6:fe', 'vm-uuid': 'c5f71bc7-14b7-4aae-992b-71709e979f38'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
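The two OVSDB transactions above are os-vif's standard OVS plug path: ensure `br-int` exists (a no-op here, "Transaction caused no change"), add the tap port, then set `external_ids` on the Interface row so that ovn-controller can match `iface-id` against its logical ports and claim the binding. A minimal sketch of the mapping being written (plain data, no OVSDB connection; the helper name is made up, the keys and values mirror the `DbSetCommand` in the log):

```python
def build_external_ids(port_id, mac, instance_uuid):
    """Assemble the OVS Interface external_ids that let ovn-controller
    bind the port to its OVN logical port. Hypothetical helper; key set
    taken from the DbSetCommand logged above."""
    return {
        "iface-id": port_id,        # matched against the OVN logical_port name
        "iface-status": "active",
        "attached-mac": mac,        # MAC Neutron allocated for the port
        "vm-uuid": instance_uuid,   # owning Nova instance
    }

ids = build_external_ids(
    "ba818b19-9f72-4242-b9d9-b1630b5d1f24",
    "fa:16:3e:0d:e6:fe",
    "c5f71bc7-14b7-4aae-992b-71709e979f38",
)
```

Once these IDs land in the local OVSDB, the "Claiming lport" messages from ovn_controller later in the log are the other half of the handshake.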
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.625 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:56 np0005532763 NetworkManager[48849]: <info>  [1763932436.6271] manager: (tapba818b19-9f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.630 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.634 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.634 231315 INFO os_vif [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:e6:fe,bridge_name='br-int',has_traffic_filtering=True,id=ba818b19-9f72-4242-b9d9-b1630b5d1f24,network=Network(fd64d126-bc30-4f96-8737-9a4b1cf2fe8a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapba818b19-9f')#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.682 231315 DEBUG nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.683 231315 DEBUG nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.683 231315 DEBUG nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No VIF found with MAC fa:16:3e:0d:e6:fe, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.684 231315 INFO nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Using config drive#033[00m
Nov 23 16:13:56 np0005532763 nova_compute[231311]: 2025-11-23 21:13:56.712 231315 DEBUG nova.storage.rbd_utils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image c5f71bc7-14b7-4aae-992b-71709e979f38_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:13:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:13:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:13:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:13:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:13:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:13:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:57.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:57 np0005532763 nova_compute[231311]: 2025-11-23 21:13:57.348 231315 DEBUG nova.network.neutron [req-a422c8cd-0b8a-4ca5-a61e-899b92b8c907 req-750086d2-81d0-4def-9405-152ab077c2f8 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Updated VIF entry in instance network info cache for port ba818b19-9f72-4242-b9d9-b1630b5d1f24. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 23 16:13:57 np0005532763 nova_compute[231311]: 2025-11-23 21:13:57.349 231315 DEBUG nova.network.neutron [req-a422c8cd-0b8a-4ca5-a61e-899b92b8c907 req-750086d2-81d0-4def-9405-152ab077c2f8 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Updating instance_info_cache with network_info: [{"id": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "address": "fa:16:3e:0d:e6:fe", "network": {"id": "fd64d126-bc30-4f96-8737-9a4b1cf2fe8a", "bridge": "br-int", "label": "tempest-network-smoke--1300883220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapba818b19-9f", "ovs_interfaceid": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:13:57 np0005532763 nova_compute[231311]: 2025-11-23 21:13:57.363 231315 DEBUG oslo_concurrency.lockutils [req-a422c8cd-0b8a-4ca5-a61e-899b92b8c907 req-750086d2-81d0-4def-9405-152ab077c2f8 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Releasing lock "refresh_cache-c5f71bc7-14b7-4aae-992b-71709e979f38" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:13:57 np0005532763 nova_compute[231311]: 2025-11-23 21:13:57.392 231315 INFO nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Creating config drive at /var/lib/nova/instances/c5f71bc7-14b7-4aae-992b-71709e979f38/disk.config#033[00m
Nov 23 16:13:57 np0005532763 nova_compute[231311]: 2025-11-23 21:13:57.401 231315 DEBUG oslo_concurrency.processutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c5f71bc7-14b7-4aae-992b-71709e979f38/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa1zfkpx2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:13:57 np0005532763 nova_compute[231311]: 2025-11-23 21:13:57.451 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:57 np0005532763 nova_compute[231311]: 2025-11-23 21:13:57.543 231315 DEBUG oslo_concurrency.processutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c5f71bc7-14b7-4aae-992b-71709e979f38/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa1zfkpx2" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:13:57 np0005532763 nova_compute[231311]: 2025-11-23 21:13:57.589 231315 DEBUG nova.storage.rbd_utils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image c5f71bc7-14b7-4aae-992b-71709e979f38_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:13:57 np0005532763 nova_compute[231311]: 2025-11-23 21:13:57.594 231315 DEBUG oslo_concurrency.processutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c5f71bc7-14b7-4aae-992b-71709e979f38/disk.config c5f71bc7-14b7-4aae-992b-71709e979f38_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:13:57 np0005532763 nova_compute[231311]: 2025-11-23 21:13:57.804 231315 DEBUG oslo_concurrency.processutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c5f71bc7-14b7-4aae-992b-71709e979f38/disk.config c5f71bc7-14b7-4aae-992b-71709e979f38_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.211s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:13:57 np0005532763 nova_compute[231311]: 2025-11-23 21:13:57.807 231315 INFO nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Deleting local config drive /var/lib/nova/instances/c5f71bc7-14b7-4aae-992b-71709e979f38/disk.config because it was imported into RBD.#033[00m
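The config-drive sequence in the lines above has three steps: build an ISO9660 image with `mkisofs` into the instance directory, `rbd import` it into the `vms` pool as `<instance_uuid>_disk.config`, then delete the local copy because RBD is now the backing store. A sketch of the two argv lists (subprocess-style, not executed here; paths and flags taken from the log, with the `-publisher` string abridged out):

```python
def config_drive_cmds(instance_uuid, tmp_dir, pool="vms"):
    """Return the mkisofs and rbd-import command lines in the order Nova
    ran them above (sketch; -publisher omitted for brevity)."""
    iso = f"/var/lib/nova/instances/{instance_uuid}/disk.config"
    mkisofs = [
        "/usr/bin/mkisofs", "-o", iso,
        "-ldots", "-allow-lowercase", "-allow-multidot", "-l",
        "-quiet", "-J", "-r", "-V", "config-2",  # volume label cloud-init looks for
        tmp_dir,                                  # staged metadata tree
    ]
    rbd_import = [
        "rbd", "import", "--pool", pool, iso,
        f"{instance_uuid}_disk.config",
        "--image-format=2", "--id", "openstack",
        "--conf", "/etc/ceph/ceph.conf",
    ]
    return mkisofs, rbd_import

mkisofs_cmd, rbd_cmd = config_drive_cmds(
    "c5f71bc7-14b7-4aae-992b-71709e979f38", "/tmp/tmpa1zfkpx2")
```

The `config-2` volume label is what the guest's cloud-init probes for when mounting the drive.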
Nov 23 16:13:57 np0005532763 virtqemud[230850]: End of file while reading data: Input/output error
Nov 23 16:13:57 np0005532763 virtqemud[230850]: End of file while reading data: Input/output error
Nov 23 16:13:57 np0005532763 kernel: tapba818b19-9f: entered promiscuous mode
Nov 23 16:13:57 np0005532763 NetworkManager[48849]: <info>  [1763932437.8912] manager: (tapba818b19-9f): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Nov 23 16:13:57 np0005532763 ovn_controller[133425]: 2025-11-23T21:13:57Z|00058|binding|INFO|Claiming lport ba818b19-9f72-4242-b9d9-b1630b5d1f24 for this chassis.
Nov 23 16:13:57 np0005532763 ovn_controller[133425]: 2025-11-23T21:13:57Z|00059|binding|INFO|ba818b19-9f72-4242-b9d9-b1630b5d1f24: Claiming fa:16:3e:0d:e6:fe 10.100.0.12
Nov 23 16:13:57 np0005532763 nova_compute[231311]: 2025-11-23 21:13:57.891 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:57 np0005532763 nova_compute[231311]: 2025-11-23 21:13:57.901 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:57 np0005532763 nova_compute[231311]: 2025-11-23 21:13:57.905 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:57 np0005532763 nova_compute[231311]: 2025-11-23 21:13:57.911 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:57 np0005532763 NetworkManager[48849]: <info>  [1763932437.9127] manager: (patch-provnet-ce139dcc-0def-41ea-bc8f-4f8d9359e223-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Nov 23 16:13:57 np0005532763 NetworkManager[48849]: <info>  [1763932437.9144] manager: (patch-br-int-to-provnet-ce139dcc-0def-41ea-bc8f-4f8d9359e223): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Nov 23 16:13:57 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:57.916 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:e6:fe 10.100.0.12'], port_security=['fa:16:3e:0d:e6:fe 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1655123038', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c5f71bc7-14b7-4aae-992b-71709e979f38', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1655123038', 'neutron:project_id': '782593db60784ab8bff41fe87d72ff5f', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'cfd1f7f1-25d4-42fe-ac59-ece898bff9bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.219'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1bc3d174-1770-40d5-b0cb-7f310bc5e484, chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], logical_port=ba818b19-9f72-4242-b9d9-b1630b5d1f24) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:13:57 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:57.919 142920 INFO neutron.agent.ovn.metadata.agent [-] Port ba818b19-9f72-4242-b9d9-b1630b5d1f24 in datapath fd64d126-bc30-4f96-8737-9a4b1cf2fe8a bound to our chassis#033[00m
Nov 23 16:13:57 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:57.921 142920 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fd64d126-bc30-4f96-8737-9a4b1cf2fe8a#033[00m
Nov 23 16:13:57 np0005532763 systemd-udevd[241592]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 16:13:57 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:57.939 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[d6a731a3-1c26-4178-adc5-2c444f5e12b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:57 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:57.941 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfd64d126-b1 in ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
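The names in the metadata-provisioning lines above follow a derivable pattern: the namespace is `ovnmeta-` plus the full network UUID, and the veth pair is `tap` plus a truncated UUID with `0`/`1` suffixes (the `-b0` end stays in the root namespace, the `-b1` end is moved into the ovnmeta namespace). A sketch of that derivation — the convention and the truncation length are inferred from this log, not from a documented interface:

```python
def ovnmeta_names(network_uuid):
    """Derive the metadata namespace and veth names seen in the log.
    Assumption: veth base = 'tap' + first 10 chars of the network UUID."""
    namespace = f"ovnmeta-{network_uuid}"
    base = f"tap{network_uuid[:10]}"
    return namespace, f"{base}0", f"{base}1"  # (namespace, root-side, ns-side)

ns, veth_root, veth_inner = ovnmeta_names("fd64d126-bc30-4f96-8737-9a4b1cf2fe8a")
```

Matching these derived names against the `tapfd64d126-b0`/`tapfd64d126-b1` lines is a quick way to tie a namespace back to its Neutron network when reading logs like this one.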
Nov 23 16:13:57 np0005532763 systemd-machined[194484]: New machine qemu-5-instance-00000009.
Nov 23 16:13:57 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:57.944 235389 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfd64d126-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 23 16:13:57 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:57.944 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[aac46188-b10a-40e9-8e06-947067297788]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:57 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:57.945 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[fa5aa4bc-c0f6-4d1b-86ae-4126032b5794]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:57 np0005532763 NetworkManager[48849]: <info>  [1763932437.9561] device (tapba818b19-9f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 23 16:13:57 np0005532763 NetworkManager[48849]: <info>  [1763932437.9576] device (tapba818b19-9f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 23 16:13:57 np0005532763 systemd[1]: Started Virtual Machine qemu-5-instance-00000009.
Nov 23 16:13:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:57 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:57.966 143034 DEBUG oslo.privsep.daemon [-] privsep: reply[b61a65cc-4674-47a8-8125-202f34fb5d01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.002 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[6ec237b7-926d-4b7e-8207-d0233b433814]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.019 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.027 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:58 np0005532763 ovn_controller[133425]: 2025-11-23T21:13:58Z|00060|binding|INFO|Setting lport ba818b19-9f72-4242-b9d9-b1630b5d1f24 ovn-installed in OVS
Nov 23 16:13:58 np0005532763 ovn_controller[133425]: 2025-11-23T21:13:58Z|00061|binding|INFO|Setting lport ba818b19-9f72-4242-b9d9-b1630b5d1f24 up in Southbound
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.038 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.041 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[74034052-3a3c-49e2-9cfa-e6d981b3491c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.048 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[46cc0fb4-fdcf-450b-874e-272afa9d9eab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:58 np0005532763 NetworkManager[48849]: <info>  [1763932438.0490] manager: (tapfd64d126-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/46)
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.090 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[2d29025a-e0c0-4a8a-86ad-2115709b08ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.094 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[16ddefe1-cedd-443c-a204-bb12fc5e7fb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:58 np0005532763 NetworkManager[48849]: <info>  [1763932438.1231] device (tapfd64d126-b0): carrier: link connected
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.131 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[2f0a7f86-3177-43f2-9a4e-1c3a5edfd511]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.160 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[e5f33b7f-f1c5-4ccc-8a0d-22489340a27d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd64d126-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:c5:fb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438766, 'reachable_time': 19869, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241625, 'error': None, 'target': 'ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.185 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b46f6d-e1b4-4621-80ba-0cea37825202]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe80:c5fb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438766, 'tstamp': 438766}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241626, 'error': None, 'target': 'ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.217 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[b3019568-8ff9-418b-8609-53a3da4326bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd64d126-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:c5:fb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438766, 'reachable_time': 19869, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241634, 'error': None, 'target': 'ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.266 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[9c2fa310-0d38-445d-a8a1-cce86c863808]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.335 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[b8e9074c-1ac2-4eb3-addb-52df4416dd88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.337 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd64d126-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.338 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.338 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd64d126-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.340 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:58 np0005532763 NetworkManager[48849]: <info>  [1763932438.3415] manager: (tapfd64d126-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Nov 23 16:13:58 np0005532763 kernel: tapfd64d126-b0: entered promiscuous mode
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.346 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.349 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfd64d126-b0, col_values=(('external_ids', {'iface-id': '6ab19126-935d-4e09-a163-fbca05fb1c6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.351 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:58 np0005532763 ovn_controller[133425]: 2025-11-23T21:13:58Z|00062|binding|INFO|Releasing lport 6ab19126-935d-4e09-a163-fbca05fb1c6f from this chassis (sb_readonly=0)
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.352 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.353 142920 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fd64d126-bc30-4f96-8737-9a4b1cf2fe8a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fd64d126-bc30-4f96-8737-9a4b1cf2fe8a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.353 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[44d904c0-7efa-40ca-b9a7-33ba823468ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.354 142920 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: global
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    log         /dev/log local0 debug
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    log-tag     haproxy-metadata-proxy-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    user        root
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    group       root
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    maxconn     1024
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    pidfile     /var/lib/neutron/external/pids/fd64d126-bc30-4f96-8737-9a4b1cf2fe8a.pid.haproxy
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    daemon
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: defaults
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    log global
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    mode http
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    option httplog
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    option dontlognull
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    option http-server-close
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    option forwardfor
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    retries                 3
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    timeout http-request    30s
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    timeout connect         30s
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    timeout client          32s
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    timeout server          32s
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    timeout http-keep-alive 30s
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: listen listener
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    bind 169.254.169.254:80
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    server metadata /var/lib/neutron/metadata_proxy
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]:    http-request add-header X-OVN-Network-ID fd64d126-bc30-4f96-8737-9a4b1cf2fe8a
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 23 16:13:58 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:13:58.355 142920 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a', 'env', 'PROCESS_TAG=haproxy-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fd64d126-bc30-4f96-8737-9a4b1cf2fe8a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.364 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.389 231315 DEBUG nova.virt.driver [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Emitting event <LifecycleEvent: 1763932438.389045, c5f71bc7-14b7-4aae-992b-71709e979f38 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.390 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] VM Started (Lifecycle Event)#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.415 231315 DEBUG nova.compute.manager [req-586fd257-50a4-4374-8232-74d3d6118ea9 req-2879d188-a633-4f19-8be5-ff9b28865b1c 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Received event network-vif-plugged-ba818b19-9f72-4242-b9d9-b1630b5d1f24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.416 231315 DEBUG oslo_concurrency.lockutils [req-586fd257-50a4-4374-8232-74d3d6118ea9 req-2879d188-a633-4f19-8be5-ff9b28865b1c 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.416 231315 DEBUG oslo_concurrency.lockutils [req-586fd257-50a4-4374-8232-74d3d6118ea9 req-2879d188-a633-4f19-8be5-ff9b28865b1c 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.417 231315 DEBUG oslo_concurrency.lockutils [req-586fd257-50a4-4374-8232-74d3d6118ea9 req-2879d188-a633-4f19-8be5-ff9b28865b1c 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.418 231315 DEBUG nova.compute.manager [req-586fd257-50a4-4374-8232-74d3d6118ea9 req-2879d188-a633-4f19-8be5-ff9b28865b1c 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Processing event network-vif-plugged-ba818b19-9f72-4242-b9d9-b1630b5d1f24 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.420 231315 DEBUG nova.compute.manager [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.426 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.431 231315 DEBUG nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.433 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 23 16:13:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:13:58.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.448 231315 INFO nova.virt.libvirt.driver [-] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Instance spawned successfully.#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.448 231315 DEBUG nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.465 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.465 231315 DEBUG nova.virt.driver [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Emitting event <LifecycleEvent: 1763932438.3899653, c5f71bc7-14b7-4aae-992b-71709e979f38 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.465 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] VM Paused (Lifecycle Event)#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.473 231315 DEBUG nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.473 231315 DEBUG nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.474 231315 DEBUG nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.474 231315 DEBUG nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.474 231315 DEBUG nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.475 231315 DEBUG nova.virt.libvirt.driver [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.500 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.506 231315 DEBUG nova.virt.driver [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Emitting event <LifecycleEvent: 1763932438.4295957, c5f71bc7-14b7-4aae-992b-71709e979f38 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.507 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] VM Resumed (Lifecycle Event)#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.538 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.543 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.549 231315 INFO nova.compute.manager [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Took 5.58 seconds to spawn the instance on the hypervisor.#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.550 231315 DEBUG nova.compute.manager [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.560 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.609 231315 INFO nova.compute.manager [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Took 6.58 seconds to build instance.#033[00m
Nov 23 16:13:58 np0005532763 nova_compute[231311]: 2025-11-23 21:13:58.624 231315 DEBUG oslo_concurrency.lockutils [None req-ec71e448-4f1e-4fb6-a2c0-e35b68699485 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "c5f71bc7-14b7-4aae-992b-71709e979f38" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:13:58 np0005532763 podman[241702]: 2025-11-23 21:13:58.852499564 +0000 UTC m=+0.098482664 container create d4fd7b64aaf85bf6180c4fdfbfe1add06a65cb534c37696247af8a30e74c36c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 23 16:13:58 np0005532763 systemd[1]: Started libpod-conmon-d4fd7b64aaf85bf6180c4fdfbfe1add06a65cb534c37696247af8a30e74c36c6.scope.
Nov 23 16:13:58 np0005532763 podman[241702]: 2025-11-23 21:13:58.813491127 +0000 UTC m=+0.059474297 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 23 16:13:58 np0005532763 systemd[1]: Started libcrun container.
Nov 23 16:13:58 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/13fd39c2e132f0ed39e8aa3e7fe59fc444071435ccd88105d1ae92aa48c2c50a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 16:13:58 np0005532763 podman[241702]: 2025-11-23 21:13:58.966517949 +0000 UTC m=+0.212501099 container init d4fd7b64aaf85bf6180c4fdfbfe1add06a65cb534c37696247af8a30e74c36c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 23 16:13:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:58 np0005532763 podman[241702]: 2025-11-23 21:13:58.973338909 +0000 UTC m=+0.219322019 container start d4fd7b64aaf85bf6180c4fdfbfe1add06a65cb534c37696247af8a30e74c36c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 16:13:59 np0005532763 neutron-haproxy-ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a[241717]: [NOTICE]   (241722) : New worker (241724) forked
Nov 23 16:13:59 np0005532763 neutron-haproxy-ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a[241717]: [NOTICE]   (241722) : Loading success.
Nov 23 16:13:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:13:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:13:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:13:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:13:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:13:59.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:13:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:13:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:13:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:00 np0005532763 podman[241734]: 2025-11-23 21:14:00.251651095 +0000 UTC m=+0.131953236 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 23 16:14:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:00.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:00 np0005532763 nova_compute[231311]: 2025-11-23 21:14:00.538 231315 DEBUG nova.compute.manager [req-88287083-159a-4be6-993e-fae47aec83b2 req-eb00d895-8efd-442b-98f1-ad65843c4e2b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Received event network-vif-plugged-ba818b19-9f72-4242-b9d9-b1630b5d1f24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:14:00 np0005532763 nova_compute[231311]: 2025-11-23 21:14:00.539 231315 DEBUG oslo_concurrency.lockutils [req-88287083-159a-4be6-993e-fae47aec83b2 req-eb00d895-8efd-442b-98f1-ad65843c4e2b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:14:00 np0005532763 nova_compute[231311]: 2025-11-23 21:14:00.539 231315 DEBUG oslo_concurrency.lockutils [req-88287083-159a-4be6-993e-fae47aec83b2 req-eb00d895-8efd-442b-98f1-ad65843c4e2b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:14:00 np0005532763 nova_compute[231311]: 2025-11-23 21:14:00.539 231315 DEBUG oslo_concurrency.lockutils [req-88287083-159a-4be6-993e-fae47aec83b2 req-eb00d895-8efd-442b-98f1-ad65843c4e2b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:14:00 np0005532763 nova_compute[231311]: 2025-11-23 21:14:00.540 231315 DEBUG nova.compute.manager [req-88287083-159a-4be6-993e-fae47aec83b2 req-eb00d895-8efd-442b-98f1-ad65843c4e2b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] No waiting events found dispatching network-vif-plugged-ba818b19-9f72-4242-b9d9-b1630b5d1f24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:14:00 np0005532763 nova_compute[231311]: 2025-11-23 21:14:00.540 231315 WARNING nova.compute.manager [req-88287083-159a-4be6-993e-fae47aec83b2 req-eb00d895-8efd-442b-98f1-ad65843c4e2b 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Received unexpected event network-vif-plugged-ba818b19-9f72-4242-b9d9-b1630b5d1f24 for instance with vm_state active and task_state None.#033[00m
Nov 23 16:14:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.033 231315 DEBUG oslo_concurrency.lockutils [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "c5f71bc7-14b7-4aae-992b-71709e979f38" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.033 231315 DEBUG oslo_concurrency.lockutils [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "c5f71bc7-14b7-4aae-992b-71709e979f38" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.034 231315 DEBUG oslo_concurrency.lockutils [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.034 231315 DEBUG oslo_concurrency.lockutils [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.035 231315 DEBUG oslo_concurrency.lockutils [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.036 231315 INFO nova.compute.manager [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Terminating instance#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.038 231315 DEBUG nova.compute.manager [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 23 16:14:01 np0005532763 kernel: tapba818b19-9f (unregistering): left promiscuous mode
Nov 23 16:14:01 np0005532763 NetworkManager[48849]: <info>  [1763932441.1321] device (tapba818b19-9f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.170 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:01 np0005532763 ovn_controller[133425]: 2025-11-23T21:14:01Z|00063|binding|INFO|Releasing lport ba818b19-9f72-4242-b9d9-b1630b5d1f24 from this chassis (sb_readonly=0)
Nov 23 16:14:01 np0005532763 ovn_controller[133425]: 2025-11-23T21:14:01Z|00064|binding|INFO|Setting lport ba818b19-9f72-4242-b9d9-b1630b5d1f24 down in Southbound
Nov 23 16:14:01 np0005532763 ovn_controller[133425]: 2025-11-23T21:14:01Z|00065|binding|INFO|Removing iface tapba818b19-9f ovn-installed in OVS
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.175 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:01 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:01.184 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:e6:fe 10.100.0.12'], port_security=['fa:16:3e:0d:e6:fe 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1655123038', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c5f71bc7-14b7-4aae-992b-71709e979f38', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1655123038', 'neutron:project_id': '782593db60784ab8bff41fe87d72ff5f', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'cfd1f7f1-25d4-42fe-ac59-ece898bff9bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.219', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1bc3d174-1770-40d5-b0cb-7f310bc5e484, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], logical_port=ba818b19-9f72-4242-b9d9-b1630b5d1f24) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:14:01 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:01.186 142920 INFO neutron.agent.ovn.metadata.agent [-] Port ba818b19-9f72-4242-b9d9-b1630b5d1f24 in datapath fd64d126-bc30-4f96-8737-9a4b1cf2fe8a unbound from our chassis#033[00m
Nov 23 16:14:01 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:01.188 142920 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd64d126-bc30-4f96-8737-9a4b1cf2fe8a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 23 16:14:01 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:01.191 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[29f848de-76ad-4945-aa3f-0d8a98be2b73]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:14:01 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:01.192 142920 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a namespace which is not needed anymore#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.200 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:01 np0005532763 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Deactivated successfully.
Nov 23 16:14:01 np0005532763 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Consumed 3.265s CPU time.
Nov 23 16:14:01 np0005532763 systemd-machined[194484]: Machine qemu-5-instance-00000009 terminated.
Nov 23 16:14:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.282 231315 INFO nova.virt.libvirt.driver [-] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Instance destroyed successfully.#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.282 231315 DEBUG nova.objects.instance [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lazy-loading 'resources' on Instance uuid c5f71bc7-14b7-4aae-992b-71709e979f38 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.298 231315 DEBUG nova.virt.libvirt.vif [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-23T21:13:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-133268455',display_name='tempest-TestNetworkBasicOps-server-133268455',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-133268455',id=9,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBN9XcQAIBv32AOnQ/uBrGNwBZurj0yvrQVcMrvkyK3ctTPLKsvruhaMgHV7K2cU5FTazPRilFrIpMab0kt1oWc9vuxyQVHP2UQmIJWRSBUBYUtVpyBjHjDA9PZXQnhSgeQ==',key_name='tempest-TestNetworkBasicOps-486410375',keypairs=<?>,launch_index=0,launched_at=2025-11-23T21:13:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-qmtdh5xg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-23T21:13:58Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=c5f71bc7-14b7-4aae-992b-71709e979f38,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "address": "fa:16:3e:0d:e6:fe", "network": {"id": "fd64d126-bc30-4f96-8737-9a4b1cf2fe8a", "bridge": "br-int", "label": "tempest-network-smoke--1300883220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapba818b19-9f", "ovs_interfaceid": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.299 231315 DEBUG nova.network.os_vif_util [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "address": "fa:16:3e:0d:e6:fe", "network": {"id": "fd64d126-bc30-4f96-8737-9a4b1cf2fe8a", "bridge": "br-int", "label": "tempest-network-smoke--1300883220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapba818b19-9f", "ovs_interfaceid": "ba818b19-9f72-4242-b9d9-b1630b5d1f24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.300 231315 DEBUG nova.network.os_vif_util [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:e6:fe,bridge_name='br-int',has_traffic_filtering=True,id=ba818b19-9f72-4242-b9d9-b1630b5d1f24,network=Network(fd64d126-bc30-4f96-8737-9a4b1cf2fe8a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapba818b19-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.300 231315 DEBUG os_vif [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:e6:fe,bridge_name='br-int',has_traffic_filtering=True,id=ba818b19-9f72-4242-b9d9-b1630b5d1f24,network=Network(fd64d126-bc30-4f96-8737-9a4b1cf2fe8a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapba818b19-9f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 23 16:14:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:01.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.303 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.303 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapba818b19-9f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.305 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.307 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.311 231315 INFO os_vif [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:e6:fe,bridge_name='br-int',has_traffic_filtering=True,id=ba818b19-9f72-4242-b9d9-b1630b5d1f24,network=Network(fd64d126-bc30-4f96-8737-9a4b1cf2fe8a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapba818b19-9f')#033[00m
Nov 23 16:14:01 np0005532763 neutron-haproxy-ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a[241717]: [NOTICE]   (241722) : haproxy version is 2.8.14-c23fe91
Nov 23 16:14:01 np0005532763 neutron-haproxy-ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a[241717]: [NOTICE]   (241722) : path to executable is /usr/sbin/haproxy
Nov 23 16:14:01 np0005532763 neutron-haproxy-ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a[241717]: [WARNING]  (241722) : Exiting Master process...
Nov 23 16:14:01 np0005532763 neutron-haproxy-ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a[241717]: [WARNING]  (241722) : Exiting Master process...
Nov 23 16:14:01 np0005532763 neutron-haproxy-ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a[241717]: [ALERT]    (241722) : Current worker (241724) exited with code 143 (Terminated)
Nov 23 16:14:01 np0005532763 neutron-haproxy-ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a[241717]: [WARNING]  (241722) : All workers exited. Exiting... (0)
Nov 23 16:14:01 np0005532763 systemd[1]: libpod-d4fd7b64aaf85bf6180c4fdfbfe1add06a65cb534c37696247af8a30e74c36c6.scope: Deactivated successfully.
Nov 23 16:14:01 np0005532763 podman[241883]: 2025-11-23 21:14:01.395236898 +0000 UTC m=+0.055759244 container died d4fd7b64aaf85bf6180c4fdfbfe1add06a65cb534c37696247af8a30e74c36c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 23 16:14:01 np0005532763 podman[241874]: 2025-11-23 21:14:01.40464229 +0000 UTC m=+0.068096447 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible)
Nov 23 16:14:01 np0005532763 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d4fd7b64aaf85bf6180c4fdfbfe1add06a65cb534c37696247af8a30e74c36c6-userdata-shm.mount: Deactivated successfully.
Nov 23 16:14:01 np0005532763 systemd[1]: var-lib-containers-storage-overlay-13fd39c2e132f0ed39e8aa3e7fe59fc444071435ccd88105d1ae92aa48c2c50a-merged.mount: Deactivated successfully.
Nov 23 16:14:01 np0005532763 podman[241883]: 2025-11-23 21:14:01.441730103 +0000 UTC m=+0.102252419 container cleanup d4fd7b64aaf85bf6180c4fdfbfe1add06a65cb534c37696247af8a30e74c36c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118)
Nov 23 16:14:01 np0005532763 systemd[1]: libpod-conmon-d4fd7b64aaf85bf6180c4fdfbfe1add06a65cb534c37696247af8a30e74c36c6.scope: Deactivated successfully.
Nov 23 16:14:01 np0005532763 podman[241947]: 2025-11-23 21:14:01.521750451 +0000 UTC m=+0.055740433 container remove d4fd7b64aaf85bf6180c4fdfbfe1add06a65cb534c37696247af8a30e74c36c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 16:14:01 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:01.533 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[80042e31-deaa-464e-a664-228de966c85f]: (4, ('Sun Nov 23 09:14:01 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a (d4fd7b64aaf85bf6180c4fdfbfe1add06a65cb534c37696247af8a30e74c36c6)\nd4fd7b64aaf85bf6180c4fdfbfe1add06a65cb534c37696247af8a30e74c36c6\nSun Nov 23 09:14:01 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a (d4fd7b64aaf85bf6180c4fdfbfe1add06a65cb534c37696247af8a30e74c36c6)\nd4fd7b64aaf85bf6180c4fdfbfe1add06a65cb534c37696247af8a30e74c36c6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:14:01 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:01.536 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[adb85a08-500e-4f99-8fe7-2fd0078e8ca6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:14:01 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:01.537 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd64d126-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:14:01 np0005532763 kernel: tapfd64d126-b0: left promiscuous mode
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.541 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.544 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:01 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:01.549 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[9fa93202-e46b-4ffb-a5a3-07289035a92f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:14:01 np0005532763 nova_compute[231311]: 2025-11-23 21:14:01.567 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:01 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:01.580 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[a8430e93-ab69-44fe-9445-4f84d3456915]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:14:01 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:01.583 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[2003e3bf-9ace-4230-a91c-fd10b789cd67]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:14:01 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:01.608 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[0367963a-70bc-4294-9f71-653e92080b99]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438757, 'reachable_time': 31684, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241962, 'error': None, 'target': 'ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:14:01 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:01.611 143034 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fd64d126-bc30-4f96-8737-9a4b1cf2fe8a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 23 16:14:01 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:01.611 143034 DEBUG oslo.privsep.daemon [-] privsep: reply[b4ee5fe8-01b2-459d-939e-588ce9074cbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:14:01 np0005532763 systemd[1]: run-netns-ovnmeta\x2dfd64d126\x2dbc30\x2d4f96\x2d8737\x2d9a4b1cf2fe8a.mount: Deactivated successfully.
Nov 23 16:14:01 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:14:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:14:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:14:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:14:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:14:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:02 np0005532763 nova_compute[231311]: 2025-11-23 21:14:02.379 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:14:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:02.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:02 np0005532763 nova_compute[231311]: 2025-11-23 21:14:02.455 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:02 np0005532763 nova_compute[231311]: 2025-11-23 21:14:02.656 231315 DEBUG nova.compute.manager [req-092058c8-f905-4671-bb3b-8033e95344f9 req-34023377-637d-4ed7-9086-05febd52c422 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Received event network-vif-unplugged-ba818b19-9f72-4242-b9d9-b1630b5d1f24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:14:02 np0005532763 nova_compute[231311]: 2025-11-23 21:14:02.658 231315 DEBUG oslo_concurrency.lockutils [req-092058c8-f905-4671-bb3b-8033e95344f9 req-34023377-637d-4ed7-9086-05febd52c422 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:14:02 np0005532763 nova_compute[231311]: 2025-11-23 21:14:02.658 231315 DEBUG oslo_concurrency.lockutils [req-092058c8-f905-4671-bb3b-8033e95344f9 req-34023377-637d-4ed7-9086-05febd52c422 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:14:02 np0005532763 nova_compute[231311]: 2025-11-23 21:14:02.659 231315 DEBUG oslo_concurrency.lockutils [req-092058c8-f905-4671-bb3b-8033e95344f9 req-34023377-637d-4ed7-9086-05febd52c422 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:14:02 np0005532763 nova_compute[231311]: 2025-11-23 21:14:02.659 231315 DEBUG nova.compute.manager [req-092058c8-f905-4671-bb3b-8033e95344f9 req-34023377-637d-4ed7-9086-05febd52c422 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] No waiting events found dispatching network-vif-unplugged-ba818b19-9f72-4242-b9d9-b1630b5d1f24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:14:02 np0005532763 nova_compute[231311]: 2025-11-23 21:14:02.660 231315 DEBUG nova.compute.manager [req-092058c8-f905-4671-bb3b-8033e95344f9 req-34023377-637d-4ed7-9086-05febd52c422 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Received event network-vif-unplugged-ba818b19-9f72-4242-b9d9-b1630b5d1f24 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 23 16:14:02 np0005532763 nova_compute[231311]: 2025-11-23 21:14:02.661 231315 DEBUG nova.compute.manager [req-092058c8-f905-4671-bb3b-8033e95344f9 req-34023377-637d-4ed7-9086-05febd52c422 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Received event network-vif-plugged-ba818b19-9f72-4242-b9d9-b1630b5d1f24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:14:02 np0005532763 nova_compute[231311]: 2025-11-23 21:14:02.661 231315 DEBUG oslo_concurrency.lockutils [req-092058c8-f905-4671-bb3b-8033e95344f9 req-34023377-637d-4ed7-9086-05febd52c422 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:14:02 np0005532763 nova_compute[231311]: 2025-11-23 21:14:02.662 231315 DEBUG oslo_concurrency.lockutils [req-092058c8-f905-4671-bb3b-8033e95344f9 req-34023377-637d-4ed7-9086-05febd52c422 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:14:02 np0005532763 nova_compute[231311]: 2025-11-23 21:14:02.663 231315 DEBUG oslo_concurrency.lockutils [req-092058c8-f905-4671-bb3b-8033e95344f9 req-34023377-637d-4ed7-9086-05febd52c422 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "c5f71bc7-14b7-4aae-992b-71709e979f38-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:14:02 np0005532763 nova_compute[231311]: 2025-11-23 21:14:02.663 231315 DEBUG nova.compute.manager [req-092058c8-f905-4671-bb3b-8033e95344f9 req-34023377-637d-4ed7-9086-05febd52c422 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] No waiting events found dispatching network-vif-plugged-ba818b19-9f72-4242-b9d9-b1630b5d1f24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:14:02 np0005532763 nova_compute[231311]: 2025-11-23 21:14:02.664 231315 WARNING nova.compute.manager [req-092058c8-f905-4671-bb3b-8033e95344f9 req-34023377-637d-4ed7-9086-05febd52c422 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Received unexpected event network-vif-plugged-ba818b19-9f72-4242-b9d9-b1630b5d1f24 for instance with vm_state active and task_state deleting.#033[00m
Nov 23 16:14:02 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:14:02 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:14:02 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:14:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:03 np0005532763 nova_compute[231311]: 2025-11-23 21:14:03.274 231315 INFO nova.virt.libvirt.driver [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Deleting instance files /var/lib/nova/instances/c5f71bc7-14b7-4aae-992b-71709e979f38_del#033[00m
Nov 23 16:14:03 np0005532763 nova_compute[231311]: 2025-11-23 21:14:03.275 231315 INFO nova.virt.libvirt.driver [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Deletion of /var/lib/nova/instances/c5f71bc7-14b7-4aae-992b-71709e979f38_del complete#033[00m
Nov 23 16:14:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:03.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:03 np0005532763 nova_compute[231311]: 2025-11-23 21:14:03.324 231315 INFO nova.compute.manager [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Took 2.29 seconds to destroy the instance on the hypervisor.#033[00m
Nov 23 16:14:03 np0005532763 nova_compute[231311]: 2025-11-23 21:14:03.324 231315 DEBUG oslo.service.loopingcall [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 23 16:14:03 np0005532763 nova_compute[231311]: 2025-11-23 21:14:03.325 231315 DEBUG nova.compute.manager [-] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 23 16:14:03 np0005532763 nova_compute[231311]: 2025-11-23 21:14:03.325 231315 DEBUG nova.network.neutron [-] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 23 16:14:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:04 np0005532763 nova_compute[231311]: 2025-11-23 21:14:04.325 231315 DEBUG nova.network.neutron [-] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:14:04 np0005532763 nova_compute[231311]: 2025-11-23 21:14:04.345 231315 INFO nova.compute.manager [-] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Took 1.02 seconds to deallocate network for instance.#033[00m
Nov 23 16:14:04 np0005532763 nova_compute[231311]: 2025-11-23 21:14:04.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:14:04 np0005532763 nova_compute[231311]: 2025-11-23 21:14:04.402 231315 DEBUG oslo_concurrency.lockutils [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:14:04 np0005532763 nova_compute[231311]: 2025-11-23 21:14:04.403 231315 DEBUG oslo_concurrency.lockutils [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:14:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:04.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:04 np0005532763 nova_compute[231311]: 2025-11-23 21:14:04.457 231315 DEBUG oslo_concurrency.processutils [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:14:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:14:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:14:04 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4100265904' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:14:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:04 np0005532763 nova_compute[231311]: 2025-11-23 21:14:04.969 231315 DEBUG oslo_concurrency.processutils [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:14:04 np0005532763 nova_compute[231311]: 2025-11-23 21:14:04.978 231315 DEBUG nova.compute.provider_tree [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:14:04 np0005532763 nova_compute[231311]: 2025-11-23 21:14:04.992 231315 DEBUG nova.scheduler.client.report [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:14:05 np0005532763 nova_compute[231311]: 2025-11-23 21:14:05.010 231315 DEBUG oslo_concurrency.lockutils [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.607s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:14:05 np0005532763 nova_compute[231311]: 2025-11-23 21:14:05.033 231315 INFO nova.scheduler.client.report [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Deleted allocations for instance c5f71bc7-14b7-4aae-992b-71709e979f38#033[00m
Nov 23 16:14:05 np0005532763 nova_compute[231311]: 2025-11-23 21:14:05.083 231315 DEBUG oslo_concurrency.lockutils [None req-c1f223ba-fa55-4f80-9149-713e674d6ad0 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "c5f71bc7-14b7-4aae-992b-71709e979f38" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.050s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:14:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:05.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:05 np0005532763 nova_compute[231311]: 2025-11-23 21:14:05.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:14:05 np0005532763 nova_compute[231311]: 2025-11-23 21:14:05.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:14:05 np0005532763 nova_compute[231311]: 2025-11-23 21:14:05.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:14:05 np0005532763 nova_compute[231311]: 2025-11-23 21:14:05.402 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:14:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:06 np0005532763 nova_compute[231311]: 2025-11-23 21:14:06.306 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:06 np0005532763 nova_compute[231311]: 2025-11-23 21:14:06.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:14:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:06.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:06 np0005532763 nova_compute[231311]: 2025-11-23 21:14:06.754 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:06.754 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3a:26:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:d5:4d:db:d5:2b'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:14:06 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:06.755 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 23 16:14:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:14:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:14:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:14:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:14:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:07.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:07 np0005532763 nova_compute[231311]: 2025-11-23 21:14:07.378 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:14:07 np0005532763 nova_compute[231311]: 2025-11-23 21:14:07.457 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:07 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:07.757 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10e3bf57-dd2d-4b94-851f-925bcd297dde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:14:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:14:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:14:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:08 np0005532763 nova_compute[231311]: 2025-11-23 21:14:08.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:14:08 np0005532763 nova_compute[231311]: 2025-11-23 21:14:08.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:14:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:08.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:09.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:09 np0005532763 nova_compute[231311]: 2025-11-23 21:14:09.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:14:09 np0005532763 nova_compute[231311]: 2025-11-23 21:14:09.401 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:14:09 np0005532763 nova_compute[231311]: 2025-11-23 21:14:09.401 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:14:09 np0005532763 nova_compute[231311]: 2025-11-23 21:14:09.402 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:14:09 np0005532763 nova_compute[231311]: 2025-11-23 21:14:09.402 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:14:09 np0005532763 nova_compute[231311]: 2025-11-23 21:14:09.403 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:14:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:14:09 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2913995880' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:14:09 np0005532763 nova_compute[231311]: 2025-11-23 21:14:09.903 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:14:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:14:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:10 np0005532763 nova_compute[231311]: 2025-11-23 21:14:10.151 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:14:10 np0005532763 nova_compute[231311]: 2025-11-23 21:14:10.154 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4861MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:14:10 np0005532763 nova_compute[231311]: 2025-11-23 21:14:10.155 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:14:10 np0005532763 nova_compute[231311]: 2025-11-23 21:14:10.155 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:14:10 np0005532763 nova_compute[231311]: 2025-11-23 21:14:10.207 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:14:10 np0005532763 nova_compute[231311]: 2025-11-23 21:14:10.208 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:14:10 np0005532763 nova_compute[231311]: 2025-11-23 21:14:10.223 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:14:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:10.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:10 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:14:10 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/376672714' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:14:10 np0005532763 nova_compute[231311]: 2025-11-23 21:14:10.698 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:14:10 np0005532763 nova_compute[231311]: 2025-11-23 21:14:10.706 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:14:10 np0005532763 nova_compute[231311]: 2025-11-23 21:14:10.718 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:14:10 np0005532763 nova_compute[231311]: 2025-11-23 21:14:10.738 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:14:10 np0005532763 nova_compute[231311]: 2025-11-23 21:14:10.738 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:14:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:10 np0005532763 nova_compute[231311]: 2025-11-23 21:14:10.981 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:11 np0005532763 nova_compute[231311]: 2025-11-23 21:14:11.052 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:11 np0005532763 nova_compute[231311]: 2025-11-23 21:14:11.308 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:11.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:11 np0005532763 nova_compute[231311]: 2025-11-23 21:14:11.739 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:14:11 np0005532763 nova_compute[231311]: 2025-11-23 21:14:11.740 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:14:11 np0005532763 nova_compute[231311]: 2025-11-23 21:14:11.740 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:14:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:14:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:14:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:14:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:14:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:12.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:12 np0005532763 nova_compute[231311]: 2025-11-23 21:14:12.459 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:13.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:14.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:14:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:15.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:16 np0005532763 nova_compute[231311]: 2025-11-23 21:14:16.280 231315 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763932441.2792144, c5f71bc7-14b7-4aae-992b-71709e979f38 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:14:16 np0005532763 nova_compute[231311]: 2025-11-23 21:14:16.283 231315 INFO nova.compute.manager [-] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] VM Stopped (Lifecycle Event)#033[00m
Nov 23 16:14:16 np0005532763 nova_compute[231311]: 2025-11-23 21:14:16.308 231315 DEBUG nova.compute.manager [None req-0c86b516-63c4-412f-a1b6-1a85166c243f - - - - - -] [instance: c5f71bc7-14b7-4aae-992b-71709e979f38] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:14:16 np0005532763 nova_compute[231311]: 2025-11-23 21:14:16.312 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:16.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:14:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:14:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:14:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:14:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:17.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:17 np0005532763 nova_compute[231311]: 2025-11-23 21:14:17.514 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:14:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:18.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:14:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:19.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:14:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:20.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:21 np0005532763 nova_compute[231311]: 2025-11-23 21:14:21.316 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:21.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:14:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:14:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:14:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:14:22 np0005532763 podman[242103]: 2025-11-23 21:14:22.216373578 +0000 UTC m=+0.089258297 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 23 16:14:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:22.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:22 np0005532763 nova_compute[231311]: 2025-11-23 21:14:22.516 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:23.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:24.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:14:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:25.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:26 np0005532763 nova_compute[231311]: 2025-11-23 21:14:26.320 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:14:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:26.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:14:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:14:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:14:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:14:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:14:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:27.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:14:27 np0005532763 nova_compute[231311]: 2025-11-23 21:14:27.519 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:14:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:28.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:29.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:14:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:30.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:31 np0005532763 podman[242157]: 2025-11-23 21:14:31.264593373 +0000 UTC m=+0.137194422 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Nov 23 16:14:31 np0005532763 nova_compute[231311]: 2025-11-23 21:14:31.321 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:14:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:31.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:14:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:14:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:14:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:14:32 np0005532763 podman[242185]: 2025-11-23 21:14:32.226713863 +0000 UTC m=+0.100126829 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 23 16:14:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:32.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:32 np0005532763 nova_compute[231311]: 2025-11-23 21:14:32.524 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:14:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:33.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:34.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:14:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:35.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:36 np0005532763 nova_compute[231311]: 2025-11-23 21:14:36.364 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:14:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:36.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:14:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:14:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:14:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:14:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:37.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:37 np0005532763 nova_compute[231311]: 2025-11-23 21:14:37.557 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:14:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:38.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:39.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:14:39.406335) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932479406374, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1113, "num_deletes": 503, "total_data_size": 1750029, "memory_usage": 1778176, "flush_reason": "Manual Compaction"}
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932479416030, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 1006534, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30342, "largest_seqno": 31449, "table_properties": {"data_size": 1002115, "index_size": 1559, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 13867, "raw_average_key_size": 19, "raw_value_size": 991050, "raw_average_value_size": 1393, "num_data_blocks": 68, "num_entries": 711, "num_filter_entries": 711, "num_deletions": 503, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763932421, "oldest_key_time": 1763932421, "file_creation_time": 1763932479, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 9756 microseconds, and 5936 cpu microseconds.
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:14:39.416089) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 1006534 bytes OK
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:14:39.416115) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:14:39.417685) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:14:39.417715) EVENT_LOG_v1 {"time_micros": 1763932479417706, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:14:39.417740) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 1743607, prev total WAL file size 1743607, number of live WAL files 2.
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:14:39.419034) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(982KB)], [57(16MB)]
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932479419080, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 18726860, "oldest_snapshot_seqno": -1}
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5783 keys, 12640217 bytes, temperature: kUnknown
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932479494481, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 12640217, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12603483, "index_size": 21200, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14469, "raw_key_size": 149697, "raw_average_key_size": 25, "raw_value_size": 12501085, "raw_average_value_size": 2161, "num_data_blocks": 849, "num_entries": 5783, "num_filter_entries": 5783, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763932479, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:14:39.495542) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 12640217 bytes
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:14:39.497440) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 246.1 rd, 166.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 16.9 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(31.2) write-amplify(12.6) OK, records in: 6795, records dropped: 1012 output_compression: NoCompression
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:14:39.497477) EVENT_LOG_v1 {"time_micros": 1763932479497461, "job": 34, "event": "compaction_finished", "compaction_time_micros": 76089, "compaction_time_cpu_micros": 52657, "output_level": 6, "num_output_files": 1, "total_output_size": 12640217, "num_input_records": 6795, "num_output_records": 5783, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932479498907, "job": 34, "event": "table_file_deletion", "file_number": 59}
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932479506377, "job": 34, "event": "table_file_deletion", "file_number": 57}
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:14:39.418947) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:14:39.506467) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:14:39.506476) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:14:39.506479) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:14:39.506482) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:14:39.506485) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:14:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:14:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:40.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:41 np0005532763 nova_compute[231311]: 2025-11-23 21:14:41.367 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:14:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:41.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:14:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:14:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:14:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:14:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:42.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:42 np0005532763 nova_compute[231311]: 2025-11-23 21:14:42.559 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:43.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:44.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:14:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:45.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:46 np0005532763 nova_compute[231311]: 2025-11-23 21:14:46.370 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:46.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:14:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:14:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:14:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:14:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:47.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:47 np0005532763 nova_compute[231311]: 2025-11-23 21:14:47.562 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:48.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:49.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:49 np0005532763 ovn_controller[133425]: 2025-11-23T21:14:49Z|00066|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 23 16:14:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:14:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:50.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:51 np0005532763 nova_compute[231311]: 2025-11-23 21:14:51.373 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:51.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:14:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:14:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:14:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:14:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:52.229 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:14:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:52.230 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:14:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:14:52.230 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:14:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:52.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:52 np0005532763 nova_compute[231311]: 2025-11-23 21:14:52.601 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:53 np0005532763 podman[242251]: 2025-11-23 21:14:53.213207142 +0000 UTC m=+0.086838537 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 23 16:14:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:53.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:14:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:54.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:14:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:14:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:55.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:56 np0005532763 nova_compute[231311]: 2025-11-23 21:14:56.376 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:56.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:14:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:14:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:14:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:14:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:14:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:57.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:57 np0005532763 nova_compute[231311]: 2025-11-23 21:14:57.604 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:14:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:14:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:14:58.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:14:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:14:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:14:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:14:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:14:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:14:59.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:14:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:14:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:14:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:00.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:01.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:01 np0005532763 nova_compute[231311]: 2025-11-23 21:15:01.419 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:15:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:15:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:15:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:15:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:02 np0005532763 podman[242280]: 2025-11-23 21:15:02.271617912 +0000 UTC m=+0.146121160 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 16:15:02 np0005532763 podman[242306]: 2025-11-23 21:15:02.39663678 +0000 UTC m=+0.089194653 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 16:15:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:02.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:02 np0005532763 nova_compute[231311]: 2025-11-23 21:15:02.642 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:03.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:04.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:15:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:05 np0005532763 nova_compute[231311]: 2025-11-23 21:15:05.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:15:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:05.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:06 np0005532763 nova_compute[231311]: 2025-11-23 21:15:06.465 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:06.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:15:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:15:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:15:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:15:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:07 np0005532763 nova_compute[231311]: 2025-11-23 21:15:07.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:15:07 np0005532763 nova_compute[231311]: 2025-11-23 21:15:07.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:15:07 np0005532763 nova_compute[231311]: 2025-11-23 21:15:07.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:15:07 np0005532763 nova_compute[231311]: 2025-11-23 21:15:07.397 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:15:07 np0005532763 nova_compute[231311]: 2025-11-23 21:15:07.398 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:15:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:07.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:07 np0005532763 nova_compute[231311]: 2025-11-23 21:15:07.690 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:07 np0005532763 nova_compute[231311]: 2025-11-23 21:15:07.712 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:07 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:07.711 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3a:26:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:d5:4d:db:d5:2b'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:15:07 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:07.713 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 23 16:15:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:15:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3390450181' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:15:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:15:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3390450181' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:15:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:08 np0005532763 nova_compute[231311]: 2025-11-23 21:15:08.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:15:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:08.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:15:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:15:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Nov 23 16:15:08 np0005532763 podman[242480]: 2025-11-23 21:15:08.87557039 +0000 UTC m=+0.084824201 container exec 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Nov 23 16:15:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:08 np0005532763 podman[242480]: 2025-11-23 21:15:08.997635995 +0000 UTC m=+0.206889776 container exec_died 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325)
Nov 23 16:15:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:09 np0005532763 nova_compute[231311]: 2025-11-23 21:15:09.378 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:15:09 np0005532763 nova_compute[231311]: 2025-11-23 21:15:09.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:15:09 np0005532763 nova_compute[231311]: 2025-11-23 21:15:09.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:15:09 np0005532763 nova_compute[231311]: 2025-11-23 21:15:09.404 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:15:09 np0005532763 nova_compute[231311]: 2025-11-23 21:15:09.405 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:15:09 np0005532763 nova_compute[231311]: 2025-11-23 21:15:09.405 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:15:09 np0005532763 nova_compute[231311]: 2025-11-23 21:15:09.406 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:15:09 np0005532763 nova_compute[231311]: 2025-11-23 21:15:09.406 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:15:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:09.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:09 np0005532763 podman[242615]: 2025-11-23 21:15:09.706227708 +0000 UTC m=+0.095122670 container exec bfa89024a4f3a8c3745fbdf8141ab9c1af6ff603988de647c9e7f7e15dff8638 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 16:15:09 np0005532763 podman[242615]: 2025-11-23 21:15:09.718614605 +0000 UTC m=+0.107509517 container exec_died bfa89024a4f3a8c3745fbdf8141ab9c1af6ff603988de647c9e7f7e15dff8638 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 16:15:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:15:09 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2727990036' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:15:09 np0005532763 nova_compute[231311]: 2025-11-23 21:15:09.915 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:15:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:15:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:10 np0005532763 nova_compute[231311]: 2025-11-23 21:15:10.141 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:15:10 np0005532763 nova_compute[231311]: 2025-11-23 21:15:10.142 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4887MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:15:10 np0005532763 nova_compute[231311]: 2025-11-23 21:15:10.142 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:15:10 np0005532763 nova_compute[231311]: 2025-11-23 21:15:10.142 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:15:10 np0005532763 nova_compute[231311]: 2025-11-23 21:15:10.201 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:15:10 np0005532763 nova_compute[231311]: 2025-11-23 21:15:10.201 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:15:10 np0005532763 nova_compute[231311]: 2025-11-23 21:15:10.218 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:15:10 np0005532763 podman[242706]: 2025-11-23 21:15:10.231998201 +0000 UTC m=+0.074086100 container exec 10ce05665482e9899a7eee0ab4547bdd9a9d872d3217d9554617c432a64e912a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 16:15:10 np0005532763 podman[242706]: 2025-11-23 21:15:10.248884945 +0000 UTC m=+0.090972804 container exec_died 10ce05665482e9899a7eee0ab4547bdd9a9d872d3217d9554617c432a64e912a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 23 16:15:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:15:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:15:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:10.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:10 np0005532763 podman[242788]: 2025-11-23 21:15:10.560511629 +0000 UTC m=+0.079500682 container exec 187afc4c1e67339be091cc4caff41c0e2aaba4673fc086f757180d516596ee6c (image=quay.io/ceph/haproxy:2.3, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem)
Nov 23 16:15:10 np0005532763 podman[242788]: 2025-11-23 21:15:10.571837037 +0000 UTC m=+0.090826100 container exec_died 187afc4c1e67339be091cc4caff41c0e2aaba4673fc086f757180d516596ee6c (image=quay.io/ceph/haproxy:2.3, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem)
Nov 23 16:15:10 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:15:10 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1766472216' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:15:10 np0005532763 nova_compute[231311]: 2025-11-23 21:15:10.705 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:15:10 np0005532763 nova_compute[231311]: 2025-11-23 21:15:10.714 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:15:10 np0005532763 nova_compute[231311]: 2025-11-23 21:15:10.728 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:15:10 np0005532763 nova_compute[231311]: 2025-11-23 21:15:10.731 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:15:10 np0005532763 nova_compute[231311]: 2025-11-23 21:15:10.732 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.590s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:15:10 np0005532763 podman[242858]: 2025-11-23 21:15:10.863076989 +0000 UTC m=+0.080949182 container exec f83166e24f35928d8e85c6352ec69e598c685dd22eb2d34bc93aec691f658844 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt, io.openshift.expose-services=, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, architecture=x86_64, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, distribution-scope=public)
Nov 23 16:15:10 np0005532763 podman[242858]: 2025-11-23 21:15:10.882681809 +0000 UTC m=+0.100553972 container exec_died f83166e24f35928d8e85c6352ec69e598c685dd22eb2d34bc93aec691f658844 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.openshift.expose-services=, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, name=keepalived, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., version=2.2.4, io.openshift.tags=Ceph keepalived, architecture=x86_64, build-date=2023-02-22T09:23:20)
Nov 23 16:15:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:11.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:11 np0005532763 nova_compute[231311]: 2025-11-23 21:15:11.468 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:11 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:15:11 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:15:11 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 23 16:15:11 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:15:11 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:15:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:15:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:15:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:15:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:15:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:12.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:12 np0005532763 nova_compute[231311]: 2025-11-23 21:15:12.693 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:12 np0005532763 nova_compute[231311]: 2025-11-23 21:15:12.733 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:15:12 np0005532763 nova_compute[231311]: 2025-11-23 21:15:12.733 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:15:12 np0005532763 nova_compute[231311]: 2025-11-23 21:15:12.734 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:15:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:13.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:13 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:15:13 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:15:13 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 23 16:15:13 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:15:13 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:15:13 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:15:13 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:15:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:14.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:14 np0005532763 ceph-mon[75752]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Nov 23 16:15:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:15:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:15.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:16 np0005532763 nova_compute[231311]: 2025-11-23 21:15:16.472 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:15:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:16.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:15:16 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:16.714 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10e3bf57-dd2d-4b94-851f-925bcd297dde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:15:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:15:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:15:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:15:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:15:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:17.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:17 np0005532763 nova_compute[231311]: 2025-11-23 21:15:17.696 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:17 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:15:17 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:15:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:18.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:19 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:15:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:19.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:15:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:20.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:21.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:21 np0005532763 nova_compute[231311]: 2025-11-23 21:15:21.476 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:15:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:15:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:15:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:15:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:22.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:22 np0005532763 nova_compute[231311]: 2025-11-23 21:15:22.698 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:23.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:24 np0005532763 podman[243116]: 2025-11-23 21:15:24.208742109 +0000 UTC m=+0.082609819 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 23 16:15:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:24.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:15:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:25.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:26 np0005532763 nova_compute[231311]: 2025-11-23 21:15:26.478 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:26.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:26 np0005532763 nova_compute[231311]: 2025-11-23 21:15:26.891 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:15:26 np0005532763 nova_compute[231311]: 2025-11-23 21:15:26.892 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:15:26 np0005532763 nova_compute[231311]: 2025-11-23 21:15:26.905 231315 DEBUG nova.compute.manager [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 23 16:15:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:26 np0005532763 nova_compute[231311]: 2025-11-23 21:15:26.989 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:15:26 np0005532763 nova_compute[231311]: 2025-11-23 21:15:26.989 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:15:26 np0005532763 nova_compute[231311]: 2025-11-23 21:15:26.999 231315 DEBUG nova.virt.hardware [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:26.999 231315 INFO nova.compute.claims [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 23 16:15:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:15:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:15:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:15:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.079 231315 DEBUG oslo_concurrency.processutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:15:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:27.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:27 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:15:27 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2993740014' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.570 231315 DEBUG oslo_concurrency.processutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.576 231315 DEBUG nova.compute.provider_tree [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.594 231315 DEBUG nova.scheduler.client.report [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.619 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.619 231315 DEBUG nova.compute.manager [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.681 231315 DEBUG nova.compute.manager [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.681 231315 DEBUG nova.network.neutron [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.699 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.702 231315 INFO nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.723 231315 DEBUG nova.compute.manager [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.828 231315 DEBUG nova.compute.manager [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.831 231315 DEBUG nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.832 231315 INFO nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Creating image(s)#033[00m
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.867 231315 DEBUG nova.storage.rbd_utils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.900 231315 DEBUG nova.storage.rbd_utils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.936 231315 DEBUG nova.storage.rbd_utils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:15:27 np0005532763 nova_compute[231311]: 2025-11-23 21:15:27.940 231315 DEBUG oslo_concurrency.processutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:15:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:28 np0005532763 nova_compute[231311]: 2025-11-23 21:15:28.031 231315 DEBUG oslo_concurrency.processutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:15:28 np0005532763 nova_compute[231311]: 2025-11-23 21:15:28.033 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:15:28 np0005532763 nova_compute[231311]: 2025-11-23 21:15:28.034 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:15:28 np0005532763 nova_compute[231311]: 2025-11-23 21:15:28.035 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "bbf6854ee7b640c267652b783cf7d20bc820aa56" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:15:28 np0005532763 nova_compute[231311]: 2025-11-23 21:15:28.074 231315 DEBUG nova.storage.rbd_utils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:15:28 np0005532763 nova_compute[231311]: 2025-11-23 21:15:28.078 231315 DEBUG oslo_concurrency.processutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:15:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:28 np0005532763 nova_compute[231311]: 2025-11-23 21:15:28.358 231315 DEBUG nova.policy [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9fb5352c62684f2ba3a326a953a10dfe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '782593db60784ab8bff41fe87d72ff5f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 23 16:15:28 np0005532763 nova_compute[231311]: 2025-11-23 21:15:28.406 231315 DEBUG oslo_concurrency.processutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/bbf6854ee7b640c267652b783cf7d20bc820aa56 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.328s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:15:28 np0005532763 nova_compute[231311]: 2025-11-23 21:15:28.517 231315 DEBUG nova.storage.rbd_utils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] resizing rbd image 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Nov 23 16:15:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:28.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:28 np0005532763 nova_compute[231311]: 2025-11-23 21:15:28.666 231315 DEBUG nova.objects.instance [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lazy-loading 'migration_context' on Instance uuid 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:15:28 np0005532763 nova_compute[231311]: 2025-11-23 21:15:28.685 231315 DEBUG nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 23 16:15:28 np0005532763 nova_compute[231311]: 2025-11-23 21:15:28.686 231315 DEBUG nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Ensure instance console log exists: /var/lib/nova/instances/8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 23 16:15:28 np0005532763 nova_compute[231311]: 2025-11-23 21:15:28.686 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:15:28 np0005532763 nova_compute[231311]: 2025-11-23 21:15:28.687 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:15:28 np0005532763 nova_compute[231311]: 2025-11-23 21:15:28.687 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:15:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:29.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:29 np0005532763 nova_compute[231311]: 2025-11-23 21:15:29.686 231315 DEBUG nova.network.neutron [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Successfully created port: 311750e0-f35b-4107-a9fb-c1a3c3b4d928 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 23 16:15:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:15:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:30.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:31.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:31 np0005532763 nova_compute[231311]: 2025-11-23 21:15:31.456 231315 DEBUG nova.network.neutron [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Successfully updated port: 311750e0-f35b-4107-a9fb-c1a3c3b4d928 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 23 16:15:31 np0005532763 nova_compute[231311]: 2025-11-23 21:15:31.472 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:15:31 np0005532763 nova_compute[231311]: 2025-11-23 21:15:31.473 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquired lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:15:31 np0005532763 nova_compute[231311]: 2025-11-23 21:15:31.473 231315 DEBUG nova.network.neutron [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 23 16:15:31 np0005532763 nova_compute[231311]: 2025-11-23 21:15:31.525 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:31 np0005532763 nova_compute[231311]: 2025-11-23 21:15:31.614 231315 DEBUG nova.compute.manager [req-547c7b8c-91be-42d8-8b05-c7237ef4cd4d req-7f4f7c1d-c4ac-4969-ab6e-0879cc5ea396 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-changed-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:15:31 np0005532763 nova_compute[231311]: 2025-11-23 21:15:31.615 231315 DEBUG nova.compute.manager [req-547c7b8c-91be-42d8-8b05-c7237ef4cd4d req-7f4f7c1d-c4ac-4969-ab6e-0879cc5ea396 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Refreshing instance network info cache due to event network-changed-311750e0-f35b-4107-a9fb-c1a3c3b4d928. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 23 16:15:31 np0005532763 nova_compute[231311]: 2025-11-23 21:15:31.616 231315 DEBUG oslo_concurrency.lockutils [req-547c7b8c-91be-42d8-8b05-c7237ef4cd4d req-7f4f7c1d-c4ac-4969-ab6e-0879cc5ea396 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:15:31 np0005532763 nova_compute[231311]: 2025-11-23 21:15:31.735 231315 DEBUG nova.network.neutron [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 23 16:15:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:15:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:15:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:15:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:15:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:32.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:32 np0005532763 nova_compute[231311]: 2025-11-23 21:15:32.739 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.090 231315 DEBUG nova.network.neutron [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Updating instance_info_cache with network_info: [{"id": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "address": "fa:16:3e:7c:77:fb", "network": {"id": "2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7", "bridge": "br-int", "label": "tempest-network-smoke--943664961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap311750e0-f3", "ovs_interfaceid": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.105 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Releasing lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.105 231315 DEBUG nova.compute.manager [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Instance network_info: |[{"id": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "address": "fa:16:3e:7c:77:fb", "network": {"id": "2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7", "bridge": "br-int", "label": "tempest-network-smoke--943664961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap311750e0-f3", "ovs_interfaceid": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.106 231315 DEBUG oslo_concurrency.lockutils [req-547c7b8c-91be-42d8-8b05-c7237ef4cd4d req-7f4f7c1d-c4ac-4969-ab6e-0879cc5ea396 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquired lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.106 231315 DEBUG nova.network.neutron [req-547c7b8c-91be-42d8-8b05-c7237ef4cd4d req-7f4f7c1d-c4ac-4969-ab6e-0879cc5ea396 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Refreshing network info cache for port 311750e0-f35b-4107-a9fb-c1a3c3b4d928 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.111 231315 DEBUG nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Start _get_guest_xml network_info=[{"id": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "address": "fa:16:3e:7c:77:fb", "network": {"id": "2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7", "bridge": "br-int", "label": "tempest-network-smoke--943664961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap311750e0-f3", "ovs_interfaceid": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-23T21:05:43Z,direct_url=<?>,disk_format='qcow2',id=3c45fa6c-8a99-4359-a34e-d89f4e1e77d0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3f8fb5175f85402ba20cf9c6989d47cf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-23T21:05:47Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'boot_index': 0, 'encryption_format': None, 'encryption_options': None, 'size': 0, 'image_id': '3c45fa6c-8a99-4359-a34e-d89f4e1e77d0'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.119 231315 WARNING nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.131 231315 DEBUG nova.virt.libvirt.host [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.132 231315 DEBUG nova.virt.libvirt.host [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.136 231315 DEBUG nova.virt.libvirt.host [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.137 231315 DEBUG nova.virt.libvirt.host [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.138 231315 DEBUG nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.138 231315 DEBUG nova.virt.hardware [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-23T21:05:42Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='56044b93-2979-48aa-b67f-c37e1b489306',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-23T21:05:43Z,direct_url=<?>,disk_format='qcow2',id=3c45fa6c-8a99-4359-a34e-d89f4e1e77d0,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='3f8fb5175f85402ba20cf9c6989d47cf',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-23T21:05:47Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.139 231315 DEBUG nova.virt.hardware [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.140 231315 DEBUG nova.virt.hardware [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.140 231315 DEBUG nova.virt.hardware [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.140 231315 DEBUG nova.virt.hardware [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.141 231315 DEBUG nova.virt.hardware [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.141 231315 DEBUG nova.virt.hardware [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.142 231315 DEBUG nova.virt.hardware [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.143 231315 DEBUG nova.virt.hardware [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.143 231315 DEBUG nova.virt.hardware [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.143 231315 DEBUG nova.virt.hardware [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.148 231315 DEBUG oslo_concurrency.processutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:15:33 np0005532763 podman[243359]: 2025-11-23 21:15:33.214575444 +0000 UTC m=+0.079342458 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 23 16:15:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:33 np0005532763 podman[243360]: 2025-11-23 21:15:33.297320396 +0000 UTC m=+0.159649831 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 16:15:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:33.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:33 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 16:15:33 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1002385892' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.616 231315 DEBUG oslo_concurrency.processutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.653 231315 DEBUG nova.storage.rbd_utils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:15:33 np0005532763 nova_compute[231311]: 2025-11-23 21:15:33.659 231315 DEBUG oslo_concurrency.processutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:15:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Nov 23 16:15:34 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3826056991' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.139 231315 DEBUG oslo_concurrency.processutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.141 231315 DEBUG nova.virt.libvirt.vif [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T21:15:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1078764307',display_name='tempest-TestNetworkBasicOps-server-1078764307',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1078764307',id=11,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBN0kySXjrbMEpZa+5DFuh5NyWqVIVYSWPP7YdIvlLi4UKs7COQcUL8O1z0kJF5qaLUlgiXSen4gzGNy2fKqXGEebH9mjccMVNm22QV/7ekbOLUHkkHW6JYEeXM6zTKwGw==',key_name='tempest-TestNetworkBasicOps-557128906',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-qt7qhyho',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-23T21:15:27Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "address": "fa:16:3e:7c:77:fb", "network": {"id": "2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7", "bridge": "br-int", "label": "tempest-network-smoke--943664961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap311750e0-f3", "ovs_interfaceid": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.142 231315 DEBUG nova.network.os_vif_util [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "address": "fa:16:3e:7c:77:fb", "network": {"id": "2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7", "bridge": "br-int", "label": "tempest-network-smoke--943664961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap311750e0-f3", "ovs_interfaceid": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.142 231315 DEBUG nova.network.os_vif_util [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:77:fb,bridge_name='br-int',has_traffic_filtering=True,id=311750e0-f35b-4107-a9fb-c1a3c3b4d928,network=Network(2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap311750e0-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.144 231315 DEBUG nova.objects.instance [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lazy-loading 'pci_devices' on Instance uuid 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.159 231315 DEBUG nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] End _get_guest_xml xml=<domain type="kvm">
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  <uuid>8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0</uuid>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  <name>instance-0000000b</name>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  <memory>131072</memory>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  <vcpu>1</vcpu>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  <metadata>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <nova:name>tempest-TestNetworkBasicOps-server-1078764307</nova:name>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <nova:creationTime>2025-11-23 21:15:33</nova:creationTime>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <nova:flavor name="m1.nano">
Nov 23 16:15:34 np0005532763 nova_compute[231311]:        <nova:memory>128</nova:memory>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:        <nova:disk>1</nova:disk>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:        <nova:swap>0</nova:swap>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:        <nova:ephemeral>0</nova:ephemeral>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:        <nova:vcpus>1</nova:vcpus>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      </nova:flavor>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <nova:owner>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:        <nova:user uuid="9fb5352c62684f2ba3a326a953a10dfe">tempest-TestNetworkBasicOps-1975357669-project-member</nova:user>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:        <nova:project uuid="782593db60784ab8bff41fe87d72ff5f">tempest-TestNetworkBasicOps-1975357669</nova:project>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      </nova:owner>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <nova:root type="image" uuid="3c45fa6c-8a99-4359-a34e-d89f4e1e77d0"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <nova:ports>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:        <nova:port uuid="311750e0-f35b-4107-a9fb-c1a3c3b4d928">
Nov 23 16:15:34 np0005532763 nova_compute[231311]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:        </nova:port>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      </nova:ports>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    </nova:instance>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  </metadata>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  <sysinfo type="smbios">
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <system>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <entry name="manufacturer">RDO</entry>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <entry name="product">OpenStack Compute</entry>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <entry name="serial">8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0</entry>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <entry name="uuid">8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0</entry>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <entry name="family">Virtual Machine</entry>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    </system>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  </sysinfo>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  <os>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <boot dev="hd"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <smbios mode="sysinfo"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  </os>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  <features>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <acpi/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <apic/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <vmcoreinfo/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  </features>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  <clock offset="utc">
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <timer name="pit" tickpolicy="delay"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <timer name="hpet" present="no"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  </clock>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  <cpu mode="host-model" match="exact">
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <topology sockets="1" cores="1" threads="1"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  </cpu>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  <devices>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <disk type="network" device="disk">
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <driver type="raw" cache="none"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <source protocol="rbd" name="vms/8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0_disk">
Nov 23 16:15:34 np0005532763 nova_compute[231311]:        <host name="192.168.122.100" port="6789"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:        <host name="192.168.122.102" port="6789"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:        <host name="192.168.122.101" port="6789"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      </source>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <auth username="openstack">
Nov 23 16:15:34 np0005532763 nova_compute[231311]:        <secret type="ceph" uuid="03808be8-ae4a-5548-82e6-4a294f1bc627"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      </auth>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <target dev="vda" bus="virtio"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    </disk>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <disk type="network" device="cdrom">
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <driver type="raw" cache="none"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <source protocol="rbd" name="vms/8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0_disk.config">
Nov 23 16:15:34 np0005532763 nova_compute[231311]:        <host name="192.168.122.100" port="6789"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:        <host name="192.168.122.102" port="6789"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:        <host name="192.168.122.101" port="6789"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      </source>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <auth username="openstack">
Nov 23 16:15:34 np0005532763 nova_compute[231311]:        <secret type="ceph" uuid="03808be8-ae4a-5548-82e6-4a294f1bc627"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      </auth>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <target dev="sda" bus="sata"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    </disk>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <interface type="ethernet">
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <mac address="fa:16:3e:7c:77:fb"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <model type="virtio"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <driver name="vhost" rx_queue_size="512"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <mtu size="1442"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <target dev="tap311750e0-f3"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    </interface>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <serial type="pty">
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <log file="/var/lib/nova/instances/8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0/console.log" append="off"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    </serial>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <video>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <model type="virtio"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    </video>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <input type="tablet" bus="usb"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <rng model="virtio">
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <backend model="random">/dev/urandom</backend>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    </rng>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="pci" model="pcie-root-port"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <controller type="usb" index="0"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    <memballoon model="virtio">
Nov 23 16:15:34 np0005532763 nova_compute[231311]:      <stats period="10"/>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:    </memballoon>
Nov 23 16:15:34 np0005532763 nova_compute[231311]:  </devices>
Nov 23 16:15:34 np0005532763 nova_compute[231311]: </domain>
Nov 23 16:15:34 np0005532763 nova_compute[231311]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.159 231315 DEBUG nova.compute.manager [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Preparing to wait for external event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.160 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.160 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.161 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.161 231315 DEBUG nova.virt.libvirt.vif [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-23T21:15:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1078764307',display_name='tempest-TestNetworkBasicOps-server-1078764307',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1078764307',id=11,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBN0kySXjrbMEpZa+5DFuh5NyWqVIVYSWPP7YdIvlLi4UKs7COQcUL8O1z0kJF5qaLUlgiXSen4gzGNy2fKqXGEebH9mjccMVNm22QV/7ekbOLUHkkHW6JYEeXM6zTKwGw==',key_name='tempest-TestNetworkBasicOps-557128906',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-qt7qhyho',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-23T21:15:27Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "address": "fa:16:3e:7c:77:fb", "network": {"id": "2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7", "bridge": "br-int", "label": "tempest-network-smoke--943664961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap311750e0-f3", "ovs_interfaceid": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.161 231315 DEBUG nova.network.os_vif_util [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "address": "fa:16:3e:7c:77:fb", "network": {"id": "2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7", "bridge": "br-int", "label": "tempest-network-smoke--943664961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap311750e0-f3", "ovs_interfaceid": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.162 231315 DEBUG nova.network.os_vif_util [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:77:fb,bridge_name='br-int',has_traffic_filtering=True,id=311750e0-f35b-4107-a9fb-c1a3c3b4d928,network=Network(2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap311750e0-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.162 231315 DEBUG os_vif [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:77:fb,bridge_name='br-int',has_traffic_filtering=True,id=311750e0-f35b-4107-a9fb-c1a3c3b4d928,network=Network(2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap311750e0-f3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.163 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.163 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.164 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.168 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.168 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap311750e0-f3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.169 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap311750e0-f3, col_values=(('external_ids', {'iface-id': '311750e0-f35b-4107-a9fb-c1a3c3b4d928', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:77:fb', 'vm-uuid': '8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.171 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:34 np0005532763 NetworkManager[48849]: <info>  [1763932534.1729] manager: (tap311750e0-f3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.174 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.179 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.180 231315 INFO os_vif [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:77:fb,bridge_name='br-int',has_traffic_filtering=True,id=311750e0-f35b-4107-a9fb-c1a3c3b4d928,network=Network(2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap311750e0-f3')#033[00m
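The three ovsdbapp commands above (AddBridgeCommand, AddPortCommand, DbSetCommand) together plug the tap device into br-int and stamp the Interface row with the Neutron port ID so ovn-controller can bind it. As a readable summary, the sketch below reconstructs the equivalent `ovs-vsctl` command lines from the values in these log records; the `ovs_plug_commands` helper is hypothetical and is not the code path os-vif actually runs.

```python
# Sketch: ovs-vsctl equivalents of the ovsdbapp transaction logged above.
# All concrete values (bridge, port, UUIDs, MAC) are copied from the log;
# the helper function itself is illustrative only.

def ovs_plug_commands(bridge, port, iface_id, mac, vm_uuid,
                      datapath_type="system"):
    """Return ovs-vsctl command lines mirroring the logged transaction."""
    external_ids = {
        "iface-id": iface_id,      # Neutron port UUID (DbSetCommand payload)
        "iface-status": "active",
        "attached-mac": mac,
        "vm-uuid": vm_uuid,        # Nova instance UUID
    }
    set_args = " ".join(f"external_ids:{k}={v}"
                        for k, v in external_ids.items())
    return [
        # AddBridgeCommand(name=br-int, may_exist=True, datapath_type=system)
        f"ovs-vsctl --may-exist add-br {bridge} "
        f"-- set Bridge {bridge} datapath_type={datapath_type}",
        # AddPortCommand(bridge=br-int, port=tap311750e0-f3, may_exist=True)
        f"ovs-vsctl --may-exist add-port {bridge} {port}",
        # DbSetCommand(table=Interface, record=tap311750e0-f3, ...)
        f"ovs-vsctl set Interface {port} {set_args}",
    ]

cmds = ovs_plug_commands(
    "br-int",
    "tap311750e0-f3",
    "311750e0-f35b-4107-a9fb-c1a3c3b4d928",
    "fa:16:3e:7c:77:fb",
    "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0",
)
```

Note how the first transaction is a no-op ("Transaction caused no change") because br-int already exists; `--may-exist` captures the same idempotency.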
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.195 231315 DEBUG nova.network.neutron [req-547c7b8c-91be-42d8-8b05-c7237ef4cd4d req-7f4f7c1d-c4ac-4969-ab6e-0879cc5ea396 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Updated VIF entry in instance network info cache for port 311750e0-f35b-4107-a9fb-c1a3c3b4d928. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.197 231315 DEBUG nova.network.neutron [req-547c7b8c-91be-42d8-8b05-c7237ef4cd4d req-7f4f7c1d-c4ac-4969-ab6e-0879cc5ea396 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Updating instance_info_cache with network_info: [{"id": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "address": "fa:16:3e:7c:77:fb", "network": {"id": "2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7", "bridge": "br-int", "label": "tempest-network-smoke--943664961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap311750e0-f3", "ovs_interfaceid": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.216 231315 DEBUG oslo_concurrency.lockutils [req-547c7b8c-91be-42d8-8b05-c7237ef4cd4d req-7f4f7c1d-c4ac-4969-ab6e-0879cc5ea396 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Releasing lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.229 231315 DEBUG nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.229 231315 DEBUG nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.230 231315 DEBUG nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] No VIF found with MAC fa:16:3e:7c:77:fb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.230 231315 INFO nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Using config drive#033[00m
Nov 23 16:15:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.269 231315 DEBUG nova.storage.rbd_utils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.511 231315 INFO nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Creating config drive at /var/lib/nova/instances/8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0/disk.config#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.520 231315 DEBUG oslo_concurrency.processutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpind6yeoy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:15:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:34.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.661 231315 DEBUG oslo_concurrency.processutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpind6yeoy" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
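The config drive built here is a plain ISO9660 image. The argv below is the mkisofs command line from the log record above, restated as a Python list purely so the flag grouping is readable; the staging directory is the temporary directory Nova populated with the metadata files before packing the ISO. This is a transcription of the logged command, not Nova's source.

```python
# Sketch: the logged mkisofs invocation as an argv list. All flags and paths
# are copied verbatim from the log line above.
instance_uuid = "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0"
iso_path = f"/var/lib/nova/instances/{instance_uuid}/disk.config"
staging_dir = "/tmp/tmpind6yeoy"  # ephemeral tmpdir name, taken from the log

mkisofs_argv = [
    "/usr/bin/mkisofs",
    "-o", iso_path,                       # output image
    "-ldots", "-allow-lowercase",         # relaxed ISO9660 file naming
    "-allow-multidot", "-l",
    "-publisher",
    "OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9",
    "-quiet",
    "-J", "-r",                           # Joliet + Rock Ridge extensions
    "-V", "config-2",                     # volume label cloud-init probes for
    staging_dir,
]
```

The `config-2` volume label is what lets the guest's cloud-init locate the drive; the command returned 0 in 0.141s per the following record.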
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.708 231315 DEBUG nova.storage.rbd_utils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] rbd image 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.712 231315 DEBUG oslo_concurrency.processutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0/disk.config 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.910 231315 DEBUG oslo_concurrency.processutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0/disk.config 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.198s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.911 231315 INFO nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Deleting local config drive /var/lib/nova/instances/8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0/disk.config because it was imported into RBD.#033[00m
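Because this deployment stores instance disks in Ceph, the locally built ISO is immediately imported into the `vms` pool as `<instance_uuid>_disk.config` and the local copy is deleted. The sketch below reconstructs the logged `rbd import` command string; the helper function is illustrative, with every value taken from the records above.

```python
# Sketch of the import-then-delete sequence logged above: push the local
# config drive into the Ceph "vms" pool, then drop the local file once the
# command exits 0. Command string reconstructed from the log line.

def rbd_import_cmd(local_path, image_name, pool="vms",
                   client_id="openstack", conf="/etc/ceph/ceph.conf"):
    return (f"rbd import --pool {pool} {local_path} {image_name} "
            f"--image-format=2 --id {client_id} --conf {conf}")

uuid = "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0"
local = f"/var/lib/nova/instances/{uuid}/disk.config"
cmd = rbd_import_cmd(local, f"{uuid}_disk.config")
# On a zero exit status Nova removes the local copy, as the INFO record
# "Deleting local config drive ... imported into RBD" shows.
```

The two earlier "rbd image ... does not exist" DEBUG records are the expected pre-import existence checks, not errors.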
Nov 23 16:15:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:15:34 np0005532763 kernel: tap311750e0-f3: entered promiscuous mode
Nov 23 16:15:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:34 np0005532763 NetworkManager[48849]: <info>  [1763932534.9853] manager: (tap311750e0-f3): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.985 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:34 np0005532763 ovn_controller[133425]: 2025-11-23T21:15:34Z|00067|binding|INFO|Claiming lport 311750e0-f35b-4107-a9fb-c1a3c3b4d928 for this chassis.
Nov 23 16:15:34 np0005532763 ovn_controller[133425]: 2025-11-23T21:15:34Z|00068|binding|INFO|311750e0-f35b-4107-a9fb-c1a3c3b4d928: Claiming fa:16:3e:7c:77:fb 10.100.0.6
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.993 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:34 np0005532763 nova_compute[231311]: 2025-11-23 21:15:34.997 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.023 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:77:fb 10.100.0.6'], port_security=['fa:16:3e:7c:77:fb 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782593db60784ab8bff41fe87d72ff5f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0928f1ff-3405-41cf-b57a-21a867de524f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=831ed7cd-9739-4cae-9853-0a7c3c8eb72f, chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], logical_port=311750e0-f35b-4107-a9fb-c1a3c3b4d928) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.026 142920 INFO neutron.agent.ovn.metadata.agent [-] Port 311750e0-f35b-4107-a9fb-c1a3c3b4d928 in datapath 2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7 bound to our chassis#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.027 142920 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7#033[00m
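The namespace and veth names that appear in the next few records follow a derivable pattern: the namespace is `ovnmeta-` plus the full network UUID, and the veth pair appears to be `tap` plus a truncated prefix of the UUID with a `0`/`1` suffix (the `0` end staying on br-ex/OVS, the `1` end moved into the namespace). The truncation rule below is inferred from this log alone, not quoted from neutron's source, so treat it as an assumption.

```python
# Sketch: reproduce the OVN metadata agent's namespace/veth names seen in
# the surrounding records. ASSUMPTION: the veth names take the first 10
# characters of the network UUID plus a '0'/'1' suffix -- inferred from
# "tap2b2cbb2b-40" / "tap2b2cbb2b-41" in this log, not from neutron code.

def metadata_names(network_uuid):
    namespace = f"ovnmeta-{network_uuid}"    # per-network metadata namespace
    veth_outer = f"tap{network_uuid[:10]}0"  # OVS-side end of the veth pair
    veth_inner = f"tap{network_uuid[:10]}1"  # end placed inside the namespace
    return namespace, veth_outer, veth_inner

ns, outer, inner = metadata_names("2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7")
```

With these names, the later "Interface tap2b2cbb2b-40 not found" DEBUG record is just the agent checking for a stale outer end before creating the pair.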
Nov 23 16:15:35 np0005532763 systemd-machined[194484]: New machine qemu-6-instance-0000000b.
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.046 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[7c217c32-96f4-4b78-abc7-f3612d748c57]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.047 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2b2cbb2b-41 in ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.049 235389 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2b2cbb2b-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.050 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e7e1c3-d517-4263-ad11-e05beafc65a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.051 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a6343e-eac3-480b-b10a-ad3d884bca35]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.063 143034 DEBUG oslo.privsep.daemon [-] privsep: reply[0192d916-92c4-4dc6-8104-cca720fb8b8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:15:35 np0005532763 systemd[1]: Started Virtual Machine qemu-6-instance-0000000b.
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.077 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:35 np0005532763 ovn_controller[133425]: 2025-11-23T21:15:35Z|00069|binding|INFO|Setting lport 311750e0-f35b-4107-a9fb-c1a3c3b4d928 ovn-installed in OVS
Nov 23 16:15:35 np0005532763 ovn_controller[133425]: 2025-11-23T21:15:35Z|00070|binding|INFO|Setting lport 311750e0-f35b-4107-a9fb-c1a3c3b4d928 up in Southbound
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.082 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.082 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[11bcdc22-24c7-4b06-8d77-c3ed4963721d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:15:35 np0005532763 systemd-udevd[243541]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 16:15:35 np0005532763 NetworkManager[48849]: <info>  [1763932535.1150] device (tap311750e0-f3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 23 16:15:35 np0005532763 NetworkManager[48849]: <info>  [1763932535.1169] device (tap311750e0-f3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.123 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[491d19a8-ff6a-4a16-b740-1d17fd6056af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:15:35 np0005532763 systemd-udevd[243545]: Network interface NamePolicy= disabled on kernel command line.
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.129 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[1e3352e2-cd65-4d51-b473-f2fe843d9a9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:15:35 np0005532763 NetworkManager[48849]: <info>  [1763932535.1323] manager: (tap2b2cbb2b-40): new Veth device (/org/freedesktop/NetworkManager/Devices/50)
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.174 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[b145155b-7c86-4929-9562-119f68fc6117]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.179 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[c1be08b9-24a2-4891-9a1e-5d5ab869af23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:15:35 np0005532763 NetworkManager[48849]: <info>  [1763932535.2172] device (tap2b2cbb2b-40): carrier: link connected
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.227 235405 DEBUG oslo.privsep.daemon [-] privsep: reply[ac9bc4e9-820d-485d-8ddd-f91706bc6397]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.254 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[84d9069c-f55b-4ac8-865e-38fad23044b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2b2cbb2b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a2:f6:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448475, 'reachable_time': 25407, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243571, 'error': None, 'target': 'ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:15:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.278 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[90271e8e-26fb-41cd-b133-4ce7a8509147]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea2:f637'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 448475, 'tstamp': 448475}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243572, 'error': None, 'target': 'ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.303 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[e41e7092-b6f1-44ce-9ef7-77ca12e5a164]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2b2cbb2b-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a2:f6:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448475, 'reachable_time': 25407, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243573, 'error': None, 'target': 'ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.350 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[f2817f0a-9db5-49f1-9d48-4b687f81cd2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.434 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[1b7d3e6d-8ed7-4046-b961-747f8d927908]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.436 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2b2cbb2b-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.436 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.437 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2b2cbb2b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.439 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:35 np0005532763 NetworkManager[48849]: <info>  [1763932535.4404] manager: (tap2b2cbb2b-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Nov 23 16:15:35 np0005532763 kernel: tap2b2cbb2b-40: entered promiscuous mode
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.444 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2b2cbb2b-40, col_values=(('external_ids', {'iface-id': '7a9e60a2-aaf5-412e-8508-c425a028014e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:15:35 np0005532763 ovn_controller[133425]: 2025-11-23T21:15:35Z|00071|binding|INFO|Releasing lport 7a9e60a2-aaf5-412e-8508-c425a028014e from this chassis (sb_readonly=0)
Nov 23 16:15:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:35.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.472 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.475 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.476 142920 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.477 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[67eee963-6917-4be7-8808-19b21fb15867]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.478 142920 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: global
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    log         /dev/log local0 debug
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    log-tag     haproxy-metadata-proxy-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    user        root
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    group       root
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    maxconn     1024
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    pidfile     /var/lib/neutron/external/pids/2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7.pid.haproxy
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    daemon
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: defaults
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    log global
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    mode http
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    option httplog
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    option dontlognull
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    option http-server-close
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    option forwardfor
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    retries                 3
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    timeout http-request    30s
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    timeout connect         30s
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    timeout client          32s
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    timeout server          32s
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    timeout http-keep-alive 30s
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: listen listener
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    bind 169.254.169.254:80
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    server metadata /var/lib/neutron/metadata_proxy
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]:    http-request add-header X-OVN-Network-ID 2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 23 16:15:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:35.479 142920 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7', 'env', 'PROCESS_TAG=haproxy-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.521 231315 DEBUG nova.compute.manager [req-d76135fd-62ee-44b8-9abe-c6f5a320ba0a req-26a5fa28-e79f-42b1-9923-51741c5969e8 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.522 231315 DEBUG oslo_concurrency.lockutils [req-d76135fd-62ee-44b8-9abe-c6f5a320ba0a req-26a5fa28-e79f-42b1-9923-51741c5969e8 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.522 231315 DEBUG oslo_concurrency.lockutils [req-d76135fd-62ee-44b8-9abe-c6f5a320ba0a req-26a5fa28-e79f-42b1-9923-51741c5969e8 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.523 231315 DEBUG oslo_concurrency.lockutils [req-d76135fd-62ee-44b8-9abe-c6f5a320ba0a req-26a5fa28-e79f-42b1-9923-51741c5969e8 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.523 231315 DEBUG nova.compute.manager [req-d76135fd-62ee-44b8-9abe-c6f5a320ba0a req-26a5fa28-e79f-42b1-9923-51741c5969e8 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Processing event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.767 231315 DEBUG nova.compute.manager [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.768 231315 DEBUG nova.virt.driver [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Emitting event <LifecycleEvent: 1763932535.7664895, 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.768 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] VM Started (Lifecycle Event)#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.775 231315 DEBUG nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.779 231315 INFO nova.virt.libvirt.driver [-] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Instance spawned successfully.#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.780 231315 DEBUG nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.793 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.802 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.807 231315 DEBUG nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.807 231315 DEBUG nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.808 231315 DEBUG nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.809 231315 DEBUG nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.809 231315 DEBUG nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.810 231315 DEBUG nova.virt.libvirt.driver [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.818 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.819 231315 DEBUG nova.virt.driver [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Emitting event <LifecycleEvent: 1763932535.7667139, 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.819 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] VM Paused (Lifecycle Event)#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.839 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.843 231315 DEBUG nova.virt.driver [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] Emitting event <LifecycleEvent: 1763932535.7740862, 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.844 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] VM Resumed (Lifecycle Event)#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.861 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.867 231315 DEBUG nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.871 231315 INFO nova.compute.manager [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Took 8.04 seconds to spawn the instance on the hypervisor.#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.872 231315 DEBUG nova.compute.manager [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.882 231315 INFO nova.compute.manager [None req-77b075fb-cfcc-4bc6-a04e-953a57029bf0 - - - - - -] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.928 231315 INFO nova.compute.manager [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Took 8.98 seconds to build instance.#033[00m
Nov 23 16:15:35 np0005532763 nova_compute[231311]: 2025-11-23 21:15:35.941 231315 DEBUG oslo_concurrency.lockutils [None req-96e077c5-bc6b-4f0c-a4a2-70f0b20772cc 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.050s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:15:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:36 np0005532763 podman[243648]: 2025-11-23 21:15:36.001016402 +0000 UTC m=+0.077613869 container create 6b1a8af9f52464e924aca9da73d730b9a7daf4123738fbffbd21f97be809a739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 16:15:36 np0005532763 systemd[1]: Started libpod-conmon-6b1a8af9f52464e924aca9da73d730b9a7daf4123738fbffbd21f97be809a739.scope.
Nov 23 16:15:36 np0005532763 podman[243648]: 2025-11-23 21:15:35.964010903 +0000 UTC m=+0.040608370 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 23 16:15:36 np0005532763 systemd[1]: Started libcrun container.
Nov 23 16:15:36 np0005532763 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e220df071e0c0b64bb122b7ea991beef96f9caca80f47a8e59e0ae62e18bb522/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 23 16:15:36 np0005532763 podman[243648]: 2025-11-23 21:15:36.10289181 +0000 UTC m=+0.179489247 container init 6b1a8af9f52464e924aca9da73d730b9a7daf4123738fbffbd21f97be809a739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 16:15:36 np0005532763 podman[243648]: 2025-11-23 21:15:36.112561002 +0000 UTC m=+0.189158439 container start 6b1a8af9f52464e924aca9da73d730b9a7daf4123738fbffbd21f97be809a739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 23 16:15:36 np0005532763 neutron-haproxy-ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7[243663]: [NOTICE]   (243667) : New worker (243669) forked
Nov 23 16:15:36 np0005532763 neutron-haproxy-ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7[243663]: [NOTICE]   (243667) : Loading success.
Nov 23 16:15:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:36.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:15:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:15:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:15:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:15:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:37.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:37 np0005532763 nova_compute[231311]: 2025-11-23 21:15:37.613 231315 DEBUG nova.compute.manager [req-f89c3736-aa8f-46e6-ad74-2b0b18472887 req-ecbbb146-120b-427e-8837-22364b6b0bfb 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:15:37 np0005532763 nova_compute[231311]: 2025-11-23 21:15:37.614 231315 DEBUG oslo_concurrency.lockutils [req-f89c3736-aa8f-46e6-ad74-2b0b18472887 req-ecbbb146-120b-427e-8837-22364b6b0bfb 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:15:37 np0005532763 nova_compute[231311]: 2025-11-23 21:15:37.614 231315 DEBUG oslo_concurrency.lockutils [req-f89c3736-aa8f-46e6-ad74-2b0b18472887 req-ecbbb146-120b-427e-8837-22364b6b0bfb 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:15:37 np0005532763 nova_compute[231311]: 2025-11-23 21:15:37.615 231315 DEBUG oslo_concurrency.lockutils [req-f89c3736-aa8f-46e6-ad74-2b0b18472887 req-ecbbb146-120b-427e-8837-22364b6b0bfb 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:15:37 np0005532763 nova_compute[231311]: 2025-11-23 21:15:37.615 231315 DEBUG nova.compute.manager [req-f89c3736-aa8f-46e6-ad74-2b0b18472887 req-ecbbb146-120b-427e-8837-22364b6b0bfb 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] No waiting events found dispatching network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:15:37 np0005532763 nova_compute[231311]: 2025-11-23 21:15:37.616 231315 WARNING nova.compute.manager [req-f89c3736-aa8f-46e6-ad74-2b0b18472887 req-ecbbb146-120b-427e-8837-22364b6b0bfb 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received unexpected event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 for instance with vm_state active and task_state None.#033[00m
Nov 23 16:15:37 np0005532763 nova_compute[231311]: 2025-11-23 21:15:37.743 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:38.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:39 np0005532763 nova_compute[231311]: 2025-11-23 21:15:39.172 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:39 np0005532763 ovn_controller[133425]: 2025-11-23T21:15:39Z|00072|binding|INFO|Releasing lport 7a9e60a2-aaf5-412e-8508-c425a028014e from this chassis (sb_readonly=0)
Nov 23 16:15:39 np0005532763 nova_compute[231311]: 2025-11-23 21:15:39.333 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:15:39 np0005532763 NetworkManager[48849]: <info>  [1763932539.3344] manager: (patch-provnet-ce139dcc-0def-41ea-bc8f-4f8d9359e223-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Nov 23 16:15:39 np0005532763 NetworkManager[48849]: <info>  [1763932539.3362] manager: (patch-br-int-to-provnet-ce139dcc-0def-41ea-bc8f-4f8d9359e223): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Nov 23 16:15:39 np0005532763 ovn_controller[133425]: 2025-11-23T21:15:39Z|00073|binding|INFO|Releasing lport 7a9e60a2-aaf5-412e-8508-c425a028014e from this chassis (sb_readonly=0)
Nov 23 16:15:39 np0005532763 nova_compute[231311]: 2025-11-23 21:15:39.398 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:15:39 np0005532763 nova_compute[231311]: 2025-11-23 21:15:39.403 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:15:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:39.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:39 np0005532763 nova_compute[231311]: 2025-11-23 21:15:39.676 231315 DEBUG nova.compute.manager [req-c67ba1b8-f4ec-460d-8b1f-614d1560fca3 req-255abf32-9693-47ea-90a3-71d0e51a7780 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-changed-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 23 16:15:39 np0005532763 nova_compute[231311]: 2025-11-23 21:15:39.677 231315 DEBUG nova.compute.manager [req-c67ba1b8-f4ec-460d-8b1f-614d1560fca3 req-255abf32-9693-47ea-90a3-71d0e51a7780 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Refreshing instance network info cache due to event network-changed-311750e0-f35b-4107-a9fb-c1a3c3b4d928. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 23 16:15:39 np0005532763 nova_compute[231311]: 2025-11-23 21:15:39.678 231315 DEBUG oslo_concurrency.lockutils [req-c67ba1b8-f4ec-460d-8b1f-614d1560fca3 req-255abf32-9693-47ea-90a3-71d0e51a7780 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 23 16:15:39 np0005532763 nova_compute[231311]: 2025-11-23 21:15:39.678 231315 DEBUG oslo_concurrency.lockutils [req-c67ba1b8-f4ec-460d-8b1f-614d1560fca3 req-255abf32-9693-47ea-90a3-71d0e51a7780 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquired lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 23 16:15:39 np0005532763 nova_compute[231311]: 2025-11-23 21:15:39.678 231315 DEBUG nova.network.neutron [req-c67ba1b8-f4ec-460d-8b1f-614d1560fca3 req-255abf32-9693-47ea-90a3-71d0e51a7780 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Refreshing network info cache for port 311750e0-f35b-4107-a9fb-c1a3c3b4d928 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 23 16:15:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:15:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:40.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:41.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:41 np0005532763 nova_compute[231311]: 2025-11-23 21:15:41.549 231315 DEBUG nova.network.neutron [req-c67ba1b8-f4ec-460d-8b1f-614d1560fca3 req-255abf32-9693-47ea-90a3-71d0e51a7780 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Updated VIF entry in instance network info cache for port 311750e0-f35b-4107-a9fb-c1a3c3b4d928. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 23 16:15:41 np0005532763 nova_compute[231311]: 2025-11-23 21:15:41.550 231315 DEBUG nova.network.neutron [req-c67ba1b8-f4ec-460d-8b1f-614d1560fca3 req-255abf32-9693-47ea-90a3-71d0e51a7780 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Updating instance_info_cache with network_info: [{"id": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "address": "fa:16:3e:7c:77:fb", "network": {"id": "2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7", "bridge": "br-int", "label": "tempest-network-smoke--943664961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap311750e0-f3", "ovs_interfaceid": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 23 16:15:41 np0005532763 nova_compute[231311]: 2025-11-23 21:15:41.571 231315 DEBUG oslo_concurrency.lockutils [req-c67ba1b8-f4ec-460d-8b1f-614d1560fca3 req-255abf32-9693-47ea-90a3-71d0e51a7780 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Releasing lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 23 16:15:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:15:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:15:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:15:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:15:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:42.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:42 np0005532763 nova_compute[231311]: 2025-11-23 21:15:42.745 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:15:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:43.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:44 np0005532763 nova_compute[231311]: 2025-11-23 21:15:44.176 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:15:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:44.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:15:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:45.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:46.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:15:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:15:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:15:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:15:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:47.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:47 np0005532763 nova_compute[231311]: 2025-11-23 21:15:47.783 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:15:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:48 np0005532763 ovn_controller[133425]: 2025-11-23T21:15:48Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7c:77:fb 10.100.0.6
Nov 23 16:15:48 np0005532763 ovn_controller[133425]: 2025-11-23T21:15:48Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7c:77:fb 10.100.0.6
Nov 23 16:15:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:48.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:49 np0005532763 nova_compute[231311]: 2025-11-23 21:15:49.179 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:15:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:49.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:15:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:50.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:51.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:15:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:15:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:15:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:15:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:52.230 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 16:15:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:52.231 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 16:15:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:15:52.232 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 16:15:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:52.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:52 np0005532763 nova_compute[231311]: 2025-11-23 21:15:52.786 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:15:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:53.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:54 np0005532763 nova_compute[231311]: 2025-11-23 21:15:54.181 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:15:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:54.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:15:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:55 np0005532763 podman[243726]: 2025-11-23 21:15:55.218254489 +0000 UTC m=+0.090186842 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 23 16:15:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:55.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:56.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:15:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:15:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:15:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:15:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:15:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:15:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:57.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:15:57 np0005532763 nova_compute[231311]: 2025-11-23 21:15:57.821 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:15:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:15:58.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:59 np0005532763 nova_compute[231311]: 2025-11-23 21:15:59.185 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:15:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:15:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:15:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:15:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:15:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:15:59.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:15:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:15:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:15:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:00.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:01.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:16:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:16:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:16:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:16:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:02.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:02 np0005532763 nova_compute[231311]: 2025-11-23 21:16:02.866 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:03.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:04 np0005532763 nova_compute[231311]: 2025-11-23 21:16:04.187 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:04 np0005532763 podman[243756]: 2025-11-23 21:16:04.213112076 +0000 UTC m=+0.087715732 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Nov 23 16:16:04 np0005532763 podman[243757]: 2025-11-23 21:16:04.245843085 +0000 UTC m=+0.117334764 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 23 16:16:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:04.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:16:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:05 np0005532763 nova_compute[231311]: 2025-11-23 21:16:05.379 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:16:05 np0005532763 nova_compute[231311]: 2025-11-23 21:16:05.395 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:16:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:05.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:06.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:16:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:16:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:16:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:16:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:07 np0005532763 nova_compute[231311]: 2025-11-23 21:16:07.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:16:07 np0005532763 nova_compute[231311]: 2025-11-23 21:16:07.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:16:07 np0005532763 nova_compute[231311]: 2025-11-23 21:16:07.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:16:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:07.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:07 np0005532763 nova_compute[231311]: 2025-11-23 21:16:07.868 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:08 np0005532763 nova_compute[231311]: 2025-11-23 21:16:08.316 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:16:08 np0005532763 nova_compute[231311]: 2025-11-23 21:16:08.316 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquired lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:16:08 np0005532763 nova_compute[231311]: 2025-11-23 21:16:08.317 231315 DEBUG nova.network.neutron [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 23 16:16:08 np0005532763 nova_compute[231311]: 2025-11-23 21:16:08.317 231315 DEBUG nova.objects.instance [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:16:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:08.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:09 np0005532763 nova_compute[231311]: 2025-11-23 21:16:09.189 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:09.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:16:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:10.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:10 np0005532763 nova_compute[231311]: 2025-11-23 21:16:10.833 231315 DEBUG nova.network.neutron [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Updating instance_info_cache with network_info: [{"id": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "address": "fa:16:3e:7c:77:fb", "network": {"id": "2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7", "bridge": "br-int", "label": "tempest-network-smoke--943664961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap311750e0-f3", "ovs_interfaceid": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:16:10 np0005532763 nova_compute[231311]: 2025-11-23 21:16:10.845 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Releasing lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:16:10 np0005532763 nova_compute[231311]: 2025-11-23 21:16:10.846 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 23 16:16:10 np0005532763 nova_compute[231311]: 2025-11-23 21:16:10.847 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:16:10 np0005532763 nova_compute[231311]: 2025-11-23 21:16:10.847 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:16:10 np0005532763 nova_compute[231311]: 2025-11-23 21:16:10.847 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:16:10 np0005532763 nova_compute[231311]: 2025-11-23 21:16:10.866 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:16:10 np0005532763 nova_compute[231311]: 2025-11-23 21:16:10.867 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:16:10 np0005532763 nova_compute[231311]: 2025-11-23 21:16:10.867 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:16:10 np0005532763 nova_compute[231311]: 2025-11-23 21:16:10.867 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:16:10 np0005532763 nova_compute[231311]: 2025-11-23 21:16:10.868 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:16:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:11 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:16:11 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1010952153' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:16:11 np0005532763 nova_compute[231311]: 2025-11-23 21:16:11.365 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:16:11 np0005532763 nova_compute[231311]: 2025-11-23 21:16:11.451 231315 DEBUG nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Nov 23 16:16:11 np0005532763 nova_compute[231311]: 2025-11-23 21:16:11.452 231315 DEBUG nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Nov 23 16:16:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:11.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:11 np0005532763 nova_compute[231311]: 2025-11-23 21:16:11.641 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:16:11 np0005532763 nova_compute[231311]: 2025-11-23 21:16:11.642 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4675MB free_disk=59.89811706542969GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:16:11 np0005532763 nova_compute[231311]: 2025-11-23 21:16:11.643 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:16:11 np0005532763 nova_compute[231311]: 2025-11-23 21:16:11.643 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:16:11 np0005532763 nova_compute[231311]: 2025-11-23 21:16:11.746 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Instance 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 23 16:16:11 np0005532763 nova_compute[231311]: 2025-11-23 21:16:11.746 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:16:11 np0005532763 nova_compute[231311]: 2025-11-23 21:16:11.747 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:16:11 np0005532763 nova_compute[231311]: 2025-11-23 21:16:11.783 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:16:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:16:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:16:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:16:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:16:12 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:16:12 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/62087560' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:16:12 np0005532763 nova_compute[231311]: 2025-11-23 21:16:12.268 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:16:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:12 np0005532763 nova_compute[231311]: 2025-11-23 21:16:12.278 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:16:12 np0005532763 nova_compute[231311]: 2025-11-23 21:16:12.296 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:16:12 np0005532763 nova_compute[231311]: 2025-11-23 21:16:12.329 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:16:12 np0005532763 nova_compute[231311]: 2025-11-23 21:16:12.330 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.687s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:16:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:12.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:12 np0005532763 nova_compute[231311]: 2025-11-23 21:16:12.866 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:16:12 np0005532763 nova_compute[231311]: 2025-11-23 21:16:12.866 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:16:12 np0005532763 nova_compute[231311]: 2025-11-23 21:16:12.867 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:16:12 np0005532763 nova_compute[231311]: 2025-11-23 21:16:12.867 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:16:12 np0005532763 nova_compute[231311]: 2025-11-23 21:16:12.913 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:13 np0005532763 nova_compute[231311]: 2025-11-23 21:16:13.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:16:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:13.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:14 np0005532763 nova_compute[231311]: 2025-11-23 21:16:14.193 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:14.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:16:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:15.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000027s ======
Nov 23 16:16:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:16.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Nov 23 16:16:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:16:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:16:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:16:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:16:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:17.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:17 np0005532763 nova_compute[231311]: 2025-11-23 21:16:17.915 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:18.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:19 np0005532763 nova_compute[231311]: 2025-11-23 21:16:19.196 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:19 np0005532763 nova_compute[231311]: 2025-11-23 21:16:19.344 231315 INFO nova.compute.manager [None req-10ae397d-03d8-48c7-9595-f4181be97b1e 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Get console output#033[00m
Nov 23 16:16:19 np0005532763 nova_compute[231311]: 2025-11-23 21:16:19.349 237838 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 23 16:16:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:19.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:16:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:20 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:20.619 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3a:26:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:d5:4d:db:d5:2b'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:16:20 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:20.620 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 23 16:16:20 np0005532763 nova_compute[231311]: 2025-11-23 21:16:20.622 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:20.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:20 np0005532763 nova_compute[231311]: 2025-11-23 21:16:20.785 231315 DEBUG nova.compute.manager [req-afb66d68-1f29-4dc3-830f-c344b7c9fa84 req-b28f345d-ac24-4f27-b3d2-82b5f765e5f7 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-changed-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:16:20 np0005532763 nova_compute[231311]: 2025-11-23 21:16:20.785 231315 DEBUG nova.compute.manager [req-afb66d68-1f29-4dc3-830f-c344b7c9fa84 req-b28f345d-ac24-4f27-b3d2-82b5f765e5f7 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Refreshing instance network info cache due to event network-changed-311750e0-f35b-4107-a9fb-c1a3c3b4d928. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 23 16:16:20 np0005532763 nova_compute[231311]: 2025-11-23 21:16:20.786 231315 DEBUG oslo_concurrency.lockutils [req-afb66d68-1f29-4dc3-830f-c344b7c9fa84 req-b28f345d-ac24-4f27-b3d2-82b5f765e5f7 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:16:20 np0005532763 nova_compute[231311]: 2025-11-23 21:16:20.786 231315 DEBUG oslo_concurrency.lockutils [req-afb66d68-1f29-4dc3-830f-c344b7c9fa84 req-b28f345d-ac24-4f27-b3d2-82b5f765e5f7 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquired lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:16:20 np0005532763 nova_compute[231311]: 2025-11-23 21:16:20.786 231315 DEBUG nova.network.neutron [req-afb66d68-1f29-4dc3-830f-c344b7c9fa84 req-b28f345d-ac24-4f27-b3d2-82b5f765e5f7 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Refreshing network info cache for port 311750e0-f35b-4107-a9fb-c1a3c3b4d928 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 23 16:16:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:21 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:16:21 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:16:21 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:16:21 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:16:21 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:16:21 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:16:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:21.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:21 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:21.623 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10e3bf57-dd2d-4b94-851f-925bcd297dde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:16:21 np0005532763 nova_compute[231311]: 2025-11-23 21:16:21.798 231315 INFO nova.compute.manager [None req-a368b0d3-00ec-4a02-8cc7-749b23eadd9f 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Get console output#033[00m
Nov 23 16:16:21 np0005532763 nova_compute[231311]: 2025-11-23 21:16:21.806 237838 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 23 16:16:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:16:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:16:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:16:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:16:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000056s ======
Nov 23 16:16:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:22.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Nov 23 16:16:22 np0005532763 nova_compute[231311]: 2025-11-23 21:16:22.952 231315 DEBUG nova.compute.manager [req-3aec81e5-78b9-4eff-8a04-69cc000c6b08 req-fd7f5788-91b0-46f0-a52c-2cdf96a354b1 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-vif-unplugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:16:22 np0005532763 nova_compute[231311]: 2025-11-23 21:16:22.953 231315 DEBUG oslo_concurrency.lockutils [req-3aec81e5-78b9-4eff-8a04-69cc000c6b08 req-fd7f5788-91b0-46f0-a52c-2cdf96a354b1 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:16:22 np0005532763 nova_compute[231311]: 2025-11-23 21:16:22.954 231315 DEBUG oslo_concurrency.lockutils [req-3aec81e5-78b9-4eff-8a04-69cc000c6b08 req-fd7f5788-91b0-46f0-a52c-2cdf96a354b1 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:16:22 np0005532763 nova_compute[231311]: 2025-11-23 21:16:22.955 231315 DEBUG oslo_concurrency.lockutils [req-3aec81e5-78b9-4eff-8a04-69cc000c6b08 req-fd7f5788-91b0-46f0-a52c-2cdf96a354b1 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:16:22 np0005532763 nova_compute[231311]: 2025-11-23 21:16:22.955 231315 DEBUG nova.compute.manager [req-3aec81e5-78b9-4eff-8a04-69cc000c6b08 req-fd7f5788-91b0-46f0-a52c-2cdf96a354b1 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] No waiting events found dispatching network-vif-unplugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:16:22 np0005532763 nova_compute[231311]: 2025-11-23 21:16:22.957 231315 WARNING nova.compute.manager [req-3aec81e5-78b9-4eff-8a04-69cc000c6b08 req-fd7f5788-91b0-46f0-a52c-2cdf96a354b1 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received unexpected event network-vif-unplugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 for instance with vm_state active and task_state None.#033[00m
Nov 23 16:16:22 np0005532763 nova_compute[231311]: 2025-11-23 21:16:22.957 231315 DEBUG nova.compute.manager [req-3aec81e5-78b9-4eff-8a04-69cc000c6b08 req-fd7f5788-91b0-46f0-a52c-2cdf96a354b1 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:16:22 np0005532763 nova_compute[231311]: 2025-11-23 21:16:22.958 231315 DEBUG oslo_concurrency.lockutils [req-3aec81e5-78b9-4eff-8a04-69cc000c6b08 req-fd7f5788-91b0-46f0-a52c-2cdf96a354b1 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:16:22 np0005532763 nova_compute[231311]: 2025-11-23 21:16:22.958 231315 DEBUG oslo_concurrency.lockutils [req-3aec81e5-78b9-4eff-8a04-69cc000c6b08 req-fd7f5788-91b0-46f0-a52c-2cdf96a354b1 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:16:22 np0005532763 nova_compute[231311]: 2025-11-23 21:16:22.958 231315 DEBUG oslo_concurrency.lockutils [req-3aec81e5-78b9-4eff-8a04-69cc000c6b08 req-fd7f5788-91b0-46f0-a52c-2cdf96a354b1 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:16:22 np0005532763 nova_compute[231311]: 2025-11-23 21:16:22.959 231315 DEBUG nova.compute.manager [req-3aec81e5-78b9-4eff-8a04-69cc000c6b08 req-fd7f5788-91b0-46f0-a52c-2cdf96a354b1 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] No waiting events found dispatching network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:16:22 np0005532763 nova_compute[231311]: 2025-11-23 21:16:22.959 231315 WARNING nova.compute.manager [req-3aec81e5-78b9-4eff-8a04-69cc000c6b08 req-fd7f5788-91b0-46f0-a52c-2cdf96a354b1 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received unexpected event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 for instance with vm_state active and task_state None.#033[00m
Nov 23 16:16:22 np0005532763 nova_compute[231311]: 2025-11-23 21:16:22.960 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:23 np0005532763 nova_compute[231311]: 2025-11-23 21:16:23.140 231315 DEBUG nova.network.neutron [req-afb66d68-1f29-4dc3-830f-c344b7c9fa84 req-b28f345d-ac24-4f27-b3d2-82b5f765e5f7 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Updated VIF entry in instance network info cache for port 311750e0-f35b-4107-a9fb-c1a3c3b4d928. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 23 16:16:23 np0005532763 nova_compute[231311]: 2025-11-23 21:16:23.141 231315 DEBUG nova.network.neutron [req-afb66d68-1f29-4dc3-830f-c344b7c9fa84 req-b28f345d-ac24-4f27-b3d2-82b5f765e5f7 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Updating instance_info_cache with network_info: [{"id": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "address": "fa:16:3e:7c:77:fb", "network": {"id": "2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7", "bridge": "br-int", "label": "tempest-network-smoke--943664961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap311750e0-f3", "ovs_interfaceid": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:16:23 np0005532763 nova_compute[231311]: 2025-11-23 21:16:23.159 231315 DEBUG oslo_concurrency.lockutils [req-afb66d68-1f29-4dc3-830f-c344b7c9fa84 req-b28f345d-ac24-4f27-b3d2-82b5f765e5f7 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Releasing lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:16:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:23.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:23 np0005532763 nova_compute[231311]: 2025-11-23 21:16:23.669 231315 INFO nova.compute.manager [None req-e7127d19-9f7e-4895-a6a8-02a427f61c62 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Get console output#033[00m
Nov 23 16:16:23 np0005532763 nova_compute[231311]: 2025-11-23 21:16:23.676 237838 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 23 16:16:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:24 np0005532763 nova_compute[231311]: 2025-11-23 21:16:24.199 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:24.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:16:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:25 np0005532763 nova_compute[231311]: 2025-11-23 21:16:25.093 231315 DEBUG nova.compute.manager [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-changed-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:16:25 np0005532763 nova_compute[231311]: 2025-11-23 21:16:25.094 231315 DEBUG nova.compute.manager [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Refreshing instance network info cache due to event network-changed-311750e0-f35b-4107-a9fb-c1a3c3b4d928. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 23 16:16:25 np0005532763 nova_compute[231311]: 2025-11-23 21:16:25.094 231315 DEBUG oslo_concurrency.lockutils [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:16:25 np0005532763 nova_compute[231311]: 2025-11-23 21:16:25.095 231315 DEBUG oslo_concurrency.lockutils [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquired lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:16:25 np0005532763 nova_compute[231311]: 2025-11-23 21:16:25.095 231315 DEBUG nova.network.neutron [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Refreshing network info cache for port 311750e0-f35b-4107-a9fb-c1a3c3b4d928 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 23 16:16:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:25.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:26 np0005532763 podman[243998]: 2025-11-23 21:16:26.066647717 +0000 UTC m=+0.112301593 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 23 16:16:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:26.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:26 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:16:26 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:16:26 np0005532763 nova_compute[231311]: 2025-11-23 21:16:26.864 231315 DEBUG nova.network.neutron [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Updated VIF entry in instance network info cache for port 311750e0-f35b-4107-a9fb-c1a3c3b4d928. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 23 16:16:26 np0005532763 nova_compute[231311]: 2025-11-23 21:16:26.864 231315 DEBUG nova.network.neutron [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Updating instance_info_cache with network_info: [{"id": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "address": "fa:16:3e:7c:77:fb", "network": {"id": "2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7", "bridge": "br-int", "label": "tempest-network-smoke--943664961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap311750e0-f3", "ovs_interfaceid": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:16:26 np0005532763 nova_compute[231311]: 2025-11-23 21:16:26.877 231315 DEBUG oslo_concurrency.lockutils [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Releasing lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:16:26 np0005532763 nova_compute[231311]: 2025-11-23 21:16:26.878 231315 DEBUG nova.compute.manager [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:16:26 np0005532763 nova_compute[231311]: 2025-11-23 21:16:26.878 231315 DEBUG oslo_concurrency.lockutils [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:16:26 np0005532763 nova_compute[231311]: 2025-11-23 21:16:26.878 231315 DEBUG oslo_concurrency.lockutils [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:16:26 np0005532763 nova_compute[231311]: 2025-11-23 21:16:26.879 231315 DEBUG oslo_concurrency.lockutils [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:16:26 np0005532763 nova_compute[231311]: 2025-11-23 21:16:26.879 231315 DEBUG nova.compute.manager [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] No waiting events found dispatching network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:16:26 np0005532763 nova_compute[231311]: 2025-11-23 21:16:26.880 231315 WARNING nova.compute.manager [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received unexpected event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 for instance with vm_state active and task_state None.#033[00m
Nov 23 16:16:26 np0005532763 nova_compute[231311]: 2025-11-23 21:16:26.880 231315 DEBUG nova.compute.manager [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:16:26 np0005532763 nova_compute[231311]: 2025-11-23 21:16:26.881 231315 DEBUG oslo_concurrency.lockutils [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:16:26 np0005532763 nova_compute[231311]: 2025-11-23 21:16:26.881 231315 DEBUG oslo_concurrency.lockutils [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:16:26 np0005532763 nova_compute[231311]: 2025-11-23 21:16:26.882 231315 DEBUG oslo_concurrency.lockutils [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:16:26 np0005532763 nova_compute[231311]: 2025-11-23 21:16:26.882 231315 DEBUG nova.compute.manager [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] No waiting events found dispatching network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:16:26 np0005532763 nova_compute[231311]: 2025-11-23 21:16:26.882 231315 WARNING nova.compute.manager [req-f7f2338e-2b1a-4861-a4ba-7b8d71a58894 req-4aba5a85-ba4a-49ac-9650-e2d57530a268 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received unexpected event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 for instance with vm_state active and task_state None.#033[00m
Nov 23 16:16:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:16:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:16:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:16:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:16:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:27.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:27 np0005532763 nova_compute[231311]: 2025-11-23 21:16:27.957 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:28.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.101 231315 DEBUG oslo_concurrency.lockutils [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.102 231315 DEBUG oslo_concurrency.lockutils [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.103 231315 DEBUG oslo_concurrency.lockutils [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.103 231315 DEBUG oslo_concurrency.lockutils [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.104 231315 DEBUG oslo_concurrency.lockutils [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.106 231315 INFO nova.compute.manager [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Terminating instance#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.107 231315 DEBUG nova.compute.manager [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 23 16:16:29 np0005532763 kernel: tap311750e0-f3 (unregistering): left promiscuous mode
Nov 23 16:16:29 np0005532763 NetworkManager[48849]: <info>  [1763932589.1640] device (tap311750e0-f3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 23 16:16:29 np0005532763 ovn_controller[133425]: 2025-11-23T21:16:29Z|00074|binding|INFO|Releasing lport 311750e0-f35b-4107-a9fb-c1a3c3b4d928 from this chassis (sb_readonly=0)
Nov 23 16:16:29 np0005532763 ovn_controller[133425]: 2025-11-23T21:16:29Z|00075|binding|INFO|Setting lport 311750e0-f35b-4107-a9fb-c1a3c3b4d928 down in Southbound
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.179 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:29 np0005532763 ovn_controller[133425]: 2025-11-23T21:16:29Z|00076|binding|INFO|Removing iface tap311750e0-f3 ovn-installed in OVS
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.183 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.189 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:77:fb 10.100.0.6'], port_security=['fa:16:3e:7c:77:fb 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782593db60784ab8bff41fe87d72ff5f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0928f1ff-3405-41cf-b57a-21a867de524f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=831ed7cd-9739-4cae-9853-0a7c3c8eb72f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], logical_port=311750e0-f35b-4107-a9fb-c1a3c3b4d928) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.191 142920 INFO neutron.agent.ovn.metadata.agent [-] Port 311750e0-f35b-4107-a9fb-c1a3c3b4d928 in datapath 2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7 unbound from our chassis#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.192 142920 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.195 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[7769dba9-d8f6-4cc3-ba32-3c413f1d7781]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.195 142920 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7 namespace which is not needed anymore#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.202 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.215 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:29 np0005532763 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Nov 23 16:16:29 np0005532763 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000b.scope: Consumed 16.362s CPU time.
Nov 23 16:16:29 np0005532763 systemd-machined[194484]: Machine qemu-6-instance-0000000b terminated.
Nov 23 16:16:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.316 231315 DEBUG nova.compute.manager [req-feee4071-79b5-49e3-8150-391fb71817ee req-1d4b81e0-d2a2-48e4-84de-b59de64830c0 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-changed-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.317 231315 DEBUG nova.compute.manager [req-feee4071-79b5-49e3-8150-391fb71817ee req-1d4b81e0-d2a2-48e4-84de-b59de64830c0 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Refreshing instance network info cache due to event network-changed-311750e0-f35b-4107-a9fb-c1a3c3b4d928. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.317 231315 DEBUG oslo_concurrency.lockutils [req-feee4071-79b5-49e3-8150-391fb71817ee req-1d4b81e0-d2a2-48e4-84de-b59de64830c0 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.318 231315 DEBUG oslo_concurrency.lockutils [req-feee4071-79b5-49e3-8150-391fb71817ee req-1d4b81e0-d2a2-48e4-84de-b59de64830c0 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquired lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.318 231315 DEBUG nova.network.neutron [req-feee4071-79b5-49e3-8150-391fb71817ee req-1d4b81e0-d2a2-48e4-84de-b59de64830c0 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Refreshing network info cache for port 311750e0-f35b-4107-a9fb-c1a3c3b4d928 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 23 16:16:29 np0005532763 kernel: tap311750e0-f3: entered promiscuous mode
Nov 23 16:16:29 np0005532763 ovn_controller[133425]: 2025-11-23T21:16:29Z|00077|binding|INFO|Claiming lport 311750e0-f35b-4107-a9fb-c1a3c3b4d928 for this chassis.
Nov 23 16:16:29 np0005532763 ovn_controller[133425]: 2025-11-23T21:16:29Z|00078|binding|INFO|311750e0-f35b-4107-a9fb-c1a3c3b4d928: Claiming fa:16:3e:7c:77:fb 10.100.0.6
Nov 23 16:16:29 np0005532763 kernel: tap311750e0-f3 (unregistering): left promiscuous mode
Nov 23 16:16:29 np0005532763 NetworkManager[48849]: <info>  [1763932589.3351] manager: (tap311750e0-f3): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.334 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.346 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:77:fb 10.100.0.6'], port_security=['fa:16:3e:7c:77:fb 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782593db60784ab8bff41fe87d72ff5f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0928f1ff-3405-41cf-b57a-21a867de524f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=831ed7cd-9739-4cae-9853-0a7c3c8eb72f, chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], logical_port=311750e0-f35b-4107-a9fb-c1a3c3b4d928) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:16:29 np0005532763 ovn_controller[133425]: 2025-11-23T21:16:29Z|00079|binding|INFO|Setting lport 311750e0-f35b-4107-a9fb-c1a3c3b4d928 ovn-installed in OVS
Nov 23 16:16:29 np0005532763 ovn_controller[133425]: 2025-11-23T21:16:29Z|00080|binding|INFO|Setting lport 311750e0-f35b-4107-a9fb-c1a3c3b4d928 up in Southbound
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.365 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:29 np0005532763 ovn_controller[133425]: 2025-11-23T21:16:29Z|00081|binding|INFO|Releasing lport 311750e0-f35b-4107-a9fb-c1a3c3b4d928 from this chassis (sb_readonly=1)
Nov 23 16:16:29 np0005532763 ovn_controller[133425]: 2025-11-23T21:16:29Z|00082|if_status|INFO|Not setting lport 311750e0-f35b-4107-a9fb-c1a3c3b4d928 down as sb is readonly
Nov 23 16:16:29 np0005532763 ovn_controller[133425]: 2025-11-23T21:16:29Z|00083|binding|INFO|Removing iface tap311750e0-f3 ovn-installed in OVS
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.370 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:29 np0005532763 ovn_controller[133425]: 2025-11-23T21:16:29Z|00084|binding|INFO|Releasing lport 311750e0-f35b-4107-a9fb-c1a3c3b4d928 from this chassis (sb_readonly=0)
Nov 23 16:16:29 np0005532763 ovn_controller[133425]: 2025-11-23T21:16:29Z|00085|binding|INFO|Setting lport 311750e0-f35b-4107-a9fb-c1a3c3b4d928 down in Southbound
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.376 231315 INFO nova.virt.libvirt.driver [-] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Instance destroyed successfully.#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.376 231315 DEBUG nova.objects.instance [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lazy-loading 'resources' on Instance uuid 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.379 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:77:fb 10.100.0.6'], port_security=['fa:16:3e:7c:77:fb 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '782593db60784ab8bff41fe87d72ff5f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0928f1ff-3405-41cf-b57a-21a867de524f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=831ed7cd-9739-4cae-9853-0a7c3c8eb72f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>], logical_port=311750e0-f35b-4107-a9fb-c1a3c3b4d928) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7a373748b0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.385 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.390 231315 DEBUG nova.virt.libvirt.vif [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-23T21:15:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1078764307',display_name='tempest-TestNetworkBasicOps-server-1078764307',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1078764307',id=11,image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBN0kySXjrbMEpZa+5DFuh5NyWqVIVYSWPP7YdIvlLi4UKs7COQcUL8O1z0kJF5qaLUlgiXSen4gzGNy2fKqXGEebH9mjccMVNm22QV/7ekbOLUHkkHW6JYEeXM6zTKwGw==',key_name='tempest-TestNetworkBasicOps-557128906',keypairs=<?>,launch_index=0,launched_at=2025-11-23T21:15:35Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='782593db60784ab8bff41fe87d72ff5f',ramdisk_id='',reservation_id='r-qt7qhyho',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c45fa6c-8a99-4359-a34e-d89f4e1e77d0',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1975357669',owner_user_name='tempest-TestNetworkBasicOps-1975357669-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-23T21:15:35Z,user_data=None,user_id='9fb5352c62684f2ba3a326a953a10dfe',uuid=8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "address": "fa:16:3e:7c:77:fb", "network": {"id": "2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7", "bridge": "br-int", "label": "tempest-network-smoke--943664961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap311750e0-f3", "ovs_interfaceid": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.391 231315 DEBUG nova.network.os_vif_util [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converting VIF {"id": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "address": "fa:16:3e:7c:77:fb", "network": {"id": "2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7", "bridge": "br-int", "label": "tempest-network-smoke--943664961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap311750e0-f3", "ovs_interfaceid": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.392 231315 DEBUG nova.network.os_vif_util [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:77:fb,bridge_name='br-int',has_traffic_filtering=True,id=311750e0-f35b-4107-a9fb-c1a3c3b4d928,network=Network(2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap311750e0-f3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.393 231315 DEBUG os_vif [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:77:fb,bridge_name='br-int',has_traffic_filtering=True,id=311750e0-f35b-4107-a9fb-c1a3c3b4d928,network=Network(2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap311750e0-f3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.396 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.396 231315 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap311750e0-f3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.399 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:29 np0005532763 neutron-haproxy-ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7[243663]: [NOTICE]   (243667) : haproxy version is 2.8.14-c23fe91
Nov 23 16:16:29 np0005532763 neutron-haproxy-ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7[243663]: [NOTICE]   (243667) : path to executable is /usr/sbin/haproxy
Nov 23 16:16:29 np0005532763 neutron-haproxy-ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7[243663]: [WARNING]  (243667) : Exiting Master process...
Nov 23 16:16:29 np0005532763 neutron-haproxy-ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7[243663]: [WARNING]  (243667) : Exiting Master process...
Nov 23 16:16:29 np0005532763 neutron-haproxy-ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7[243663]: [ALERT]    (243667) : Current worker (243669) exited with code 143 (Terminated)
Nov 23 16:16:29 np0005532763 neutron-haproxy-ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7[243663]: [WARNING]  (243667) : All workers exited. Exiting... (0)
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.403 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 23 16:16:29 np0005532763 systemd[1]: libpod-6b1a8af9f52464e924aca9da73d730b9a7daf4123738fbffbd21f97be809a739.scope: Deactivated successfully.
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.407 231315 INFO os_vif [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:77:fb,bridge_name='br-int',has_traffic_filtering=True,id=311750e0-f35b-4107-a9fb-c1a3c3b4d928,network=Network(2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap311750e0-f3')#033[00m
Nov 23 16:16:29 np0005532763 podman[244070]: 2025-11-23 21:16:29.41280351 +0000 UTC m=+0.078196365 container died 6b1a8af9f52464e924aca9da73d730b9a7daf4123738fbffbd21f97be809a739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 16:16:29 np0005532763 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b1a8af9f52464e924aca9da73d730b9a7daf4123738fbffbd21f97be809a739-userdata-shm.mount: Deactivated successfully.
Nov 23 16:16:29 np0005532763 systemd[1]: var-lib-containers-storage-overlay-e220df071e0c0b64bb122b7ea991beef96f9caca80f47a8e59e0ae62e18bb522-merged.mount: Deactivated successfully.
Nov 23 16:16:29 np0005532763 podman[244070]: 2025-11-23 21:16:29.477779904 +0000 UTC m=+0.143172769 container cleanup 6b1a8af9f52464e924aca9da73d730b9a7daf4123738fbffbd21f97be809a739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 23 16:16:29 np0005532763 systemd[1]: libpod-conmon-6b1a8af9f52464e924aca9da73d730b9a7daf4123738fbffbd21f97be809a739.scope: Deactivated successfully.
Nov 23 16:16:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:29.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:29 np0005532763 podman[244121]: 2025-11-23 21:16:29.583911542 +0000 UTC m=+0.069170442 container remove 6b1a8af9f52464e924aca9da73d730b9a7daf4123738fbffbd21f97be809a739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.594 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[b19b7deb-4a3e-4773-af2a-4839c6c8252c]: (4, ('Sun Nov 23 09:16:29 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7 (6b1a8af9f52464e924aca9da73d730b9a7daf4123738fbffbd21f97be809a739)\n6b1a8af9f52464e924aca9da73d730b9a7daf4123738fbffbd21f97be809a739\nSun Nov 23 09:16:29 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7 (6b1a8af9f52464e924aca9da73d730b9a7daf4123738fbffbd21f97be809a739)\n6b1a8af9f52464e924aca9da73d730b9a7daf4123738fbffbd21f97be809a739\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.596 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[8bb298d9-1e32-4571-bc4e-be00ce85c85c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.597 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2b2cbb2b-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.600 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:29 np0005532763 kernel: tap2b2cbb2b-40: left promiscuous mode
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.627 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.632 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[2946b166-f725-4a59-845b-d95f935cb56c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.648 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[f8f45403-e890-4bd8-93db-d5bfd870e3db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.651 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[caffdf52-943b-441a-b076-98a2ab77aca5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.674 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[a0843578-8a27-4055-be09-c845e29172ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 448465, 'reachable_time': 40752, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244135, 'error': None, 'target': 'ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.678 143034 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.679 143034 DEBUG oslo.privsep.daemon [-] privsep: reply[934f841b-b81d-4ac7-bdf2-b801b19b7301]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:16:29 np0005532763 systemd[1]: run-netns-ovnmeta\x2d2b2cbb2b\x2d4635\x2d48f6\x2d97b3\x2db4c96d1d06f7.mount: Deactivated successfully.
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.680 142920 INFO neutron.agent.ovn.metadata.agent [-] Port 311750e0-f35b-4107-a9fb-c1a3c3b4d928 in datapath 2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7 unbound from our chassis#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.682 142920 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.683 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[fe2d5371-3df2-4ac1-a4bf-53c24e83eb43]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.684 142920 INFO neutron.agent.ovn.metadata.agent [-] Port 311750e0-f35b-4107-a9fb-c1a3c3b4d928 in datapath 2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7 unbound from our chassis#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.686 142920 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 23 16:16:29 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:29.686 235389 DEBUG oslo.privsep.daemon [-] privsep: reply[c960b4dc-b9d4-4ec8-822b-5e283a3bcf43]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.897 231315 INFO nova.virt.libvirt.driver [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Deleting instance files /var/lib/nova/instances/8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0_del#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.898 231315 INFO nova.virt.libvirt.driver [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Deletion of /var/lib/nova/instances/8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0_del complete#033[00m
Nov 23 16:16:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.966 231315 INFO nova.compute.manager [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Took 0.86 seconds to destroy the instance on the hypervisor.#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.967 231315 DEBUG oslo.service.loopingcall [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.967 231315 DEBUG nova.compute.manager [-] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 23 16:16:29 np0005532763 nova_compute[231311]: 2025-11-23 21:16:29.967 231315 DEBUG nova.network.neutron [-] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 23 16:16:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:30.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.158 231315 DEBUG nova.network.neutron [req-feee4071-79b5-49e3-8150-391fb71817ee req-1d4b81e0-d2a2-48e4-84de-b59de64830c0 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Updated VIF entry in instance network info cache for port 311750e0-f35b-4107-a9fb-c1a3c3b4d928. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.158 231315 DEBUG nova.network.neutron [req-feee4071-79b5-49e3-8150-391fb71817ee req-1d4b81e0-d2a2-48e4-84de-b59de64830c0 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Updating instance_info_cache with network_info: [{"id": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "address": "fa:16:3e:7c:77:fb", "network": {"id": "2b2cbb2b-4635-48f6-97b3-b4c96d1d06f7", "bridge": "br-int", "label": "tempest-network-smoke--943664961", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "782593db60784ab8bff41fe87d72ff5f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap311750e0-f3", "ovs_interfaceid": "311750e0-f35b-4107-a9fb-c1a3c3b4d928", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.173 231315 DEBUG oslo_concurrency.lockutils [req-feee4071-79b5-49e3-8150-391fb71817ee req-1d4b81e0-d2a2-48e4-84de-b59de64830c0 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Releasing lock "refresh_cache-8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 23 16:16:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.372 231315 DEBUG nova.network.neutron [-] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.386 231315 DEBUG nova.compute.manager [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-vif-unplugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.387 231315 DEBUG oslo_concurrency.lockutils [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.387 231315 DEBUG oslo_concurrency.lockutils [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.388 231315 DEBUG oslo_concurrency.lockutils [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.388 231315 DEBUG nova.compute.manager [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] No waiting events found dispatching network-vif-unplugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.389 231315 DEBUG nova.compute.manager [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-vif-unplugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.389 231315 DEBUG nova.compute.manager [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.390 231315 DEBUG oslo_concurrency.lockutils [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.390 231315 DEBUG oslo_concurrency.lockutils [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.390 231315 DEBUG oslo_concurrency.lockutils [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.391 231315 DEBUG nova.compute.manager [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] No waiting events found dispatching network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.391 231315 WARNING nova.compute.manager [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received unexpected event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 for instance with vm_state active and task_state deleting.#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.392 231315 DEBUG nova.compute.manager [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.392 231315 DEBUG oslo_concurrency.lockutils [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.393 231315 DEBUG oslo_concurrency.lockutils [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.393 231315 DEBUG oslo_concurrency.lockutils [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.393 231315 DEBUG nova.compute.manager [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] No waiting events found dispatching network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.394 231315 WARNING nova.compute.manager [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received unexpected event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 for instance with vm_state active and task_state deleting.#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.394 231315 DEBUG nova.compute.manager [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.395 231315 DEBUG oslo_concurrency.lockutils [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.395 231315 DEBUG oslo_concurrency.lockutils [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.395 231315 DEBUG oslo_concurrency.lockutils [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.396 231315 DEBUG nova.compute.manager [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] No waiting events found dispatching network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.396 231315 WARNING nova.compute.manager [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received unexpected event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 for instance with vm_state active and task_state deleting.#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.397 231315 DEBUG nova.compute.manager [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-vif-unplugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.397 231315 DEBUG oslo_concurrency.lockutils [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.397 231315 DEBUG oslo_concurrency.lockutils [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.398 231315 DEBUG oslo_concurrency.lockutils [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.398 231315 DEBUG nova.compute.manager [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] No waiting events found dispatching network-vif-unplugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.399 231315 DEBUG nova.compute.manager [req-8d5d61ad-1a6d-42cd-a0ad-6735a16277a1 req-47b7067b-744b-4eda-8f44-95e27d938f61 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-vif-unplugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.400 231315 INFO nova.compute.manager [-] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Took 1.43 seconds to deallocate network for instance.#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.439 231315 DEBUG oslo_concurrency.lockutils [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.440 231315 DEBUG oslo_concurrency.lockutils [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:16:31 np0005532763 nova_compute[231311]: 2025-11-23 21:16:31.497 231315 DEBUG oslo_concurrency.processutils [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:16:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:31.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:31 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:16:31 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2926754021' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:16:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:16:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:16:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:16:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:16:32 np0005532763 nova_compute[231311]: 2025-11-23 21:16:32.005 231315 DEBUG oslo_concurrency.processutils [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:16:32 np0005532763 nova_compute[231311]: 2025-11-23 21:16:32.014 231315 DEBUG nova.compute.provider_tree [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:16:32 np0005532763 nova_compute[231311]: 2025-11-23 21:16:32.037 231315 DEBUG nova.scheduler.client.report [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:16:32 np0005532763 nova_compute[231311]: 2025-11-23 21:16:32.064 231315 DEBUG oslo_concurrency.lockutils [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.624s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:16:32 np0005532763 nova_compute[231311]: 2025-11-23 21:16:32.088 231315 INFO nova.scheduler.client.report [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Deleted allocations for instance 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0#033[00m
Nov 23 16:16:32 np0005532763 nova_compute[231311]: 2025-11-23 21:16:32.204 231315 DEBUG oslo_concurrency.lockutils [None req-1a0b1c0c-9f0d-458b-9c70-e04db136cdb8 9fb5352c62684f2ba3a326a953a10dfe 782593db60784ab8bff41fe87d72ff5f - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:16:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:32.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:32 np0005532763 nova_compute[231311]: 2025-11-23 21:16:32.983 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:33 np0005532763 nova_compute[231311]: 2025-11-23 21:16:33.499 231315 DEBUG nova.compute.manager [req-00551055-d152-40c2-a3fc-7efd6ca120c9 req-b763852e-cbb9-42ec-9700-0364294272bf 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:16:33 np0005532763 nova_compute[231311]: 2025-11-23 21:16:33.500 231315 DEBUG oslo_concurrency.lockutils [req-00551055-d152-40c2-a3fc-7efd6ca120c9 req-b763852e-cbb9-42ec-9700-0364294272bf 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Acquiring lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:16:33 np0005532763 nova_compute[231311]: 2025-11-23 21:16:33.500 231315 DEBUG oslo_concurrency.lockutils [req-00551055-d152-40c2-a3fc-7efd6ca120c9 req-b763852e-cbb9-42ec-9700-0364294272bf 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:16:33 np0005532763 nova_compute[231311]: 2025-11-23 21:16:33.500 231315 DEBUG oslo_concurrency.lockutils [req-00551055-d152-40c2-a3fc-7efd6ca120c9 req-b763852e-cbb9-42ec-9700-0364294272bf 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] Lock "8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:16:33 np0005532763 nova_compute[231311]: 2025-11-23 21:16:33.501 231315 DEBUG nova.compute.manager [req-00551055-d152-40c2-a3fc-7efd6ca120c9 req-b763852e-cbb9-42ec-9700-0364294272bf 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] No waiting events found dispatching network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 23 16:16:33 np0005532763 nova_compute[231311]: 2025-11-23 21:16:33.501 231315 WARNING nova.compute.manager [req-00551055-d152-40c2-a3fc-7efd6ca120c9 req-b763852e-cbb9-42ec-9700-0364294272bf 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received unexpected event network-vif-plugged-311750e0-f35b-4107-a9fb-c1a3c3b4d928 for instance with vm_state deleted and task_state None.#033[00m
Nov 23 16:16:33 np0005532763 nova_compute[231311]: 2025-11-23 21:16:33.501 231315 DEBUG nova.compute.manager [req-00551055-d152-40c2-a3fc-7efd6ca120c9 req-b763852e-cbb9-42ec-9700-0364294272bf 30cdb2d57472403e887bb57c5aa3b413 a3a7655dfdb941daaf17f7f95d16950a - - default default] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Received event network-vif-deleted-311750e0-f35b-4107-a9fb-c1a3c3b4d928 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 23 16:16:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:33.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:34 np0005532763 nova_compute[231311]: 2025-11-23 21:16:34.401 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:34.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:16:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:35 np0005532763 podman[244165]: 2025-11-23 21:16:35.221466444 +0000 UTC m=+0.097266649 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 23 16:16:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:35 np0005532763 podman[244166]: 2025-11-23 21:16:35.30256674 +0000 UTC m=+0.174052654 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 23 16:16:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:35.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:36 np0005532763 nova_compute[231311]: 2025-11-23 21:16:36.121 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:36 np0005532763 nova_compute[231311]: 2025-11-23 21:16:36.211 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:36.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:16:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:16:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:16:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:16:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:37.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:37 np0005532763 nova_compute[231311]: 2025-11-23 21:16:37.985 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:38.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:39 np0005532763 nova_compute[231311]: 2025-11-23 21:16:39.405 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:39.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:16:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:40.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:41.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:16:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:16:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:16:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:16:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:42.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:43 np0005532763 nova_compute[231311]: 2025-11-23 21:16:43.010 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:43.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:44 np0005532763 nova_compute[231311]: 2025-11-23 21:16:44.373 231315 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763932589.3719833, 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 23 16:16:44 np0005532763 nova_compute[231311]: 2025-11-23 21:16:44.374 231315 INFO nova.compute.manager [-] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] VM Stopped (Lifecycle Event)#033[00m
Nov 23 16:16:44 np0005532763 nova_compute[231311]: 2025-11-23 21:16:44.398 231315 DEBUG nova.compute.manager [None req-b50ffe6a-a390-4a96-b678-817c4d086745 - - - - - -] [instance: 8a2d7a44-ab53-4f48-a9f3-cec2df97a7c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 23 16:16:44 np0005532763 nova_compute[231311]: 2025-11-23 21:16:44.409 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:44.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:16:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:45.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:46.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:16:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:16:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:16:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:16:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:47.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:48 np0005532763 nova_compute[231311]: 2025-11-23 21:16:48.053 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:48.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:49 np0005532763 nova_compute[231311]: 2025-11-23 21:16:49.412 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:49.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:16:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:16:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:50.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:16:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:51.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:16:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:16:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:16:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:16:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:52.232 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:16:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:52.233 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:16:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:16:52.233 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:16:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:52.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:53 np0005532763 nova_compute[231311]: 2025-11-23 21:16:53.062 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:53.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:54 np0005532763 nova_compute[231311]: 2025-11-23 21:16:54.416 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:54.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:16:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:55.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:56 np0005532763 podman[244258]: 2025-11-23 21:16:56.239813062 +0000 UTC m=+0.117194710 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 23 16:16:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:16:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:56.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:16:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:16:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:16:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:16:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:16:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:16:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:57.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:58 np0005532763 nova_compute[231311]: 2025-11-23 21:16:58.089 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:16:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:16:58.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:16:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:16:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:16:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:16:59 np0005532763 nova_compute[231311]: 2025-11-23 21:16:59.419 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:16:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:16:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:16:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:16:59.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:16:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:17:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:00.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:01.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:17:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:17:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:17:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:17:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:02.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:03 np0005532763 nova_compute[231311]: 2025-11-23 21:17:03.106 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:03.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:04 np0005532763 nova_compute[231311]: 2025-11-23 21:17:04.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:17:04 np0005532763 nova_compute[231311]: 2025-11-23 21:17:04.423 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:04.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:17:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:05.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:06 np0005532763 podman[244287]: 2025-11-23 21:17:06.232797086 +0000 UTC m=+0.099910875 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 16:17:06 np0005532763 podman[244288]: 2025-11-23 21:17:06.272198321 +0000 UTC m=+0.136605554 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller)
Nov 23 16:17:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:06.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:17:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:17:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:17:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:17:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:07 np0005532763 nova_compute[231311]: 2025-11-23 21:17:07.391 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:17:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:17:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:07.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:17:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:08 np0005532763 nova_compute[231311]: 2025-11-23 21:17:08.136 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:08.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:09 np0005532763 nova_compute[231311]: 2025-11-23 21:17:09.379 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:17:09 np0005532763 nova_compute[231311]: 2025-11-23 21:17:09.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:17:09 np0005532763 nova_compute[231311]: 2025-11-23 21:17:09.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:17:09 np0005532763 nova_compute[231311]: 2025-11-23 21:17:09.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:17:09 np0005532763 nova_compute[231311]: 2025-11-23 21:17:09.398 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:17:09 np0005532763 nova_compute[231311]: 2025-11-23 21:17:09.398 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:17:09 np0005532763 nova_compute[231311]: 2025-11-23 21:17:09.399 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:17:09 np0005532763 nova_compute[231311]: 2025-11-23 21:17:09.433 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:09.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:17:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:10.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:11.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:17:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:17:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:17:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:17:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:12 np0005532763 nova_compute[231311]: 2025-11-23 21:17:12.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:17:12 np0005532763 nova_compute[231311]: 2025-11-23 21:17:12.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:17:12 np0005532763 nova_compute[231311]: 2025-11-23 21:17:12.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:17:12 np0005532763 nova_compute[231311]: 2025-11-23 21:17:12.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:17:12 np0005532763 nova_compute[231311]: 2025-11-23 21:17:12.403 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:17:12 np0005532763 nova_compute[231311]: 2025-11-23 21:17:12.405 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:17:12 np0005532763 nova_compute[231311]: 2025-11-23 21:17:12.405 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:17:12 np0005532763 nova_compute[231311]: 2025-11-23 21:17:12.405 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:17:12 np0005532763 nova_compute[231311]: 2025-11-23 21:17:12.406 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:17:12 np0005532763 ovn_controller[133425]: 2025-11-23T21:17:12Z|00086|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 23 16:17:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:12.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:12 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:17:12 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2950651412' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:17:12 np0005532763 nova_compute[231311]: 2025-11-23 21:17:12.890 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:17:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:13 np0005532763 nova_compute[231311]: 2025-11-23 21:17:13.165 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:13 np0005532763 nova_compute[231311]: 2025-11-23 21:17:13.193 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:17:13 np0005532763 nova_compute[231311]: 2025-11-23 21:17:13.195 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4855MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:17:13 np0005532763 nova_compute[231311]: 2025-11-23 21:17:13.196 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:17:13 np0005532763 nova_compute[231311]: 2025-11-23 21:17:13.196 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:17:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:13 np0005532763 nova_compute[231311]: 2025-11-23 21:17:13.323 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:17:13 np0005532763 nova_compute[231311]: 2025-11-23 21:17:13.323 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:17:13 np0005532763 nova_compute[231311]: 2025-11-23 21:17:13.384 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:17:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:13.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:13 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:17:13 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2299030320' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:17:13 np0005532763 nova_compute[231311]: 2025-11-23 21:17:13.901 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:17:13 np0005532763 nova_compute[231311]: 2025-11-23 21:17:13.910 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:17:13 np0005532763 nova_compute[231311]: 2025-11-23 21:17:13.925 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:17:13 np0005532763 nova_compute[231311]: 2025-11-23 21:17:13.953 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:17:13 np0005532763 nova_compute[231311]: 2025-11-23 21:17:13.954 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.758s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:17:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:14 np0005532763 nova_compute[231311]: 2025-11-23 21:17:14.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:17:14 np0005532763 nova_compute[231311]: 2025-11-23 21:17:14.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 23 16:17:14 np0005532763 nova_compute[231311]: 2025-11-23 21:17:14.397 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 23 16:17:14 np0005532763 nova_compute[231311]: 2025-11-23 21:17:14.437 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:14.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:17:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:15 np0005532763 nova_compute[231311]: 2025-11-23 21:17:15.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:17:15 np0005532763 nova_compute[231311]: 2025-11-23 21:17:15.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:17:15 np0005532763 nova_compute[231311]: 2025-11-23 21:17:15.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 23 16:17:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:15.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:16.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:17:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:17:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:17:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:17:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:17.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:18 np0005532763 nova_compute[231311]: 2025-11-23 21:17:18.211 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:18.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:19 np0005532763 nova_compute[231311]: 2025-11-23 21:17:19.466 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:19.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:17:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:20.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:21.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:17:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:17:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:17:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:17:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:17:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:22.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:17:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:23 np0005532763 nova_compute[231311]: 2025-11-23 21:17:23.255 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:23.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:24 np0005532763 nova_compute[231311]: 2025-11-23 21:17:24.470 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:24.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:17:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:25.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:26.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:17:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:17:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:17:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:17:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:27 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:17:27.029 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3a:26:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:d5:4d:db:d5:2b'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 23 16:17:27 np0005532763 nova_compute[231311]: 2025-11-23 21:17:27.029 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:27 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:17:27.032 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 23 16:17:27 np0005532763 podman[244528]: 2025-11-23 21:17:27.219998743 +0000 UTC m=+0.091879820 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 16:17:27 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:17:27 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:17:27 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:17:27 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:17:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:27.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:28 np0005532763 nova_compute[231311]: 2025-11-23 21:17:28.295 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:28.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:29 np0005532763 nova_compute[231311]: 2025-11-23 21:17:29.472 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:29.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:17:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:30.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:31.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:17:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:17:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:17:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:17:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:32 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:17:32 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:17:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:32.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:33 np0005532763 nova_compute[231311]: 2025-11-23 21:17:33.334 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:17:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:33.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:17:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:34 np0005532763 nova_compute[231311]: 2025-11-23 21:17:34.475 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:34.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:17:34 np0005532763 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 23 16:17:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:35 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:17:35.035 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10e3bf57-dd2d-4b94-851f-925bcd297dde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 23 16:17:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:35.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:36.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:17:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:17:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:17:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:17:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:37 np0005532763 podman[244586]: 2025-11-23 21:17:37.21062998 +0000 UTC m=+0.089743630 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Nov 23 16:17:37 np0005532763 podman[244587]: 2025-11-23 21:17:37.255500769 +0000 UTC m=+0.125475112 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 16:17:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:37.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:38 np0005532763 nova_compute[231311]: 2025-11-23 21:17:38.368 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:38.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:39 np0005532763 nova_compute[231311]: 2025-11-23 21:17:39.522 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:39.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:17:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:40.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:41.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:17:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:17:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:17:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:17:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:42.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:43 np0005532763 nova_compute[231311]: 2025-11-23 21:17:43.417 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:43.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:44 np0005532763 nova_compute[231311]: 2025-11-23 21:17:44.554 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:44.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:17:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:45.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:46.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:17:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:17:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:17:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:17:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:47.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:48 np0005532763 nova_compute[231311]: 2025-11-23 21:17:48.422 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:48.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:49 np0005532763 nova_compute[231311]: 2025-11-23 21:17:49.557 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:49.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:17:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:50.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:51.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:17:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:17:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:17:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:17:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:17:52.233 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:17:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:17:52.234 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:17:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:17:52.234 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:17:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:52.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:53 np0005532763 nova_compute[231311]: 2025-11-23 21:17:53.424 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:53.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:54 np0005532763 nova_compute[231311]: 2025-11-23 21:17:54.604 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:54.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:17:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:55.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:56.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:17:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:17:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:17:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:17:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:17:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:57.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:58 np0005532763 podman[244679]: 2025-11-23 21:17:58.206449851 +0000 UTC m=+0.076191529 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 23 16:17:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:58 np0005532763 nova_compute[231311]: 2025-11-23 21:17:58.464 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:17:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:17:58.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:17:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:17:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:17:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:17:59 np0005532763 nova_compute[231311]: 2025-11-23 21:17:59.631 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:17:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:17:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:17:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:17:59.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:17:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:18:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:00.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:01.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:18:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:18:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:18:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:18:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:02.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:03 np0005532763 nova_compute[231311]: 2025-11-23 21:18:03.464 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:18:03.576261) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932683576540, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2377, "num_deletes": 251, "total_data_size": 6373867, "memory_usage": 6458240, "flush_reason": "Manual Compaction"}
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932683598019, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 4091898, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31454, "largest_seqno": 33826, "table_properties": {"data_size": 4082334, "index_size": 6058, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19859, "raw_average_key_size": 20, "raw_value_size": 4063205, "raw_average_value_size": 4184, "num_data_blocks": 261, "num_entries": 971, "num_filter_entries": 971, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763932480, "oldest_key_time": 1763932480, "file_creation_time": 1763932683, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 21796 microseconds, and 14897 cpu microseconds.
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:18:03.598235) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 4091898 bytes OK
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:18:03.598379) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:18:03.600676) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:18:03.600701) EVENT_LOG_v1 {"time_micros": 1763932683600693, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:18:03.600733) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 6363493, prev total WAL file size 6363493, number of live WAL files 2.
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:18:03.603954) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(3995KB)], [60(12MB)]
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932683604001, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 16732115, "oldest_snapshot_seqno": -1}
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6233 keys, 14606009 bytes, temperature: kUnknown
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932683684252, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 14606009, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14564782, "index_size": 24541, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15621, "raw_key_size": 159776, "raw_average_key_size": 25, "raw_value_size": 14453075, "raw_average_value_size": 2318, "num_data_blocks": 987, "num_entries": 6233, "num_filter_entries": 6233, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763932683, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:18:03.684984) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 14606009 bytes
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:18:03.686628) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 208.2 rd, 181.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 12.1 +0.0 blob) out(13.9 +0.0 blob), read-write-amplify(7.7) write-amplify(3.6) OK, records in: 6754, records dropped: 521 output_compression: NoCompression
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:18:03.686660) EVENT_LOG_v1 {"time_micros": 1763932683686646, "job": 36, "event": "compaction_finished", "compaction_time_micros": 80350, "compaction_time_cpu_micros": 52451, "output_level": 6, "num_output_files": 1, "total_output_size": 14606009, "num_input_records": 6754, "num_output_records": 6233, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932683688088, "job": 36, "event": "table_file_deletion", "file_number": 62}
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932683692422, "job": 36, "event": "table_file_deletion", "file_number": 60}
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:18:03.603847) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:18:03.692529) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:18:03.692537) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:18:03.692540) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:18:03.692543) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:18:03 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:18:03.692545) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:18:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:03.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:04 np0005532763 nova_compute[231311]: 2025-11-23 21:18:04.667 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:04.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:18:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:05.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:06 np0005532763 nova_compute[231311]: 2025-11-23 21:18:06.393 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:18:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:06.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:18:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:18:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:18:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:18:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:07 np0005532763 podman[244733]: 2025-11-23 21:18:07.468601927 +0000 UTC m=+0.095909152 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 23 16:18:07 np0005532763 podman[244734]: 2025-11-23 21:18:07.510664537 +0000 UTC m=+0.132557380 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 16:18:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:07.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:18:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2931937022' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:18:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:18:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2931937022' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:18:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:08 np0005532763 nova_compute[231311]: 2025-11-23 21:18:08.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:18:08 np0005532763 nova_compute[231311]: 2025-11-23 21:18:08.500 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:08.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:09 np0005532763 nova_compute[231311]: 2025-11-23 21:18:09.711 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:18:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:09.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:18:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:18:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:10 np0005532763 nova_compute[231311]: 2025-11-23 21:18:10.380 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:18:10 np0005532763 nova_compute[231311]: 2025-11-23 21:18:10.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:18:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:10.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:11 np0005532763 nova_compute[231311]: 2025-11-23 21:18:11.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:18:11 np0005532763 nova_compute[231311]: 2025-11-23 21:18:11.385 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:18:11 np0005532763 nova_compute[231311]: 2025-11-23 21:18:11.385 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:18:11 np0005532763 nova_compute[231311]: 2025-11-23 21:18:11.398 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:18:11 np0005532763 nova_compute[231311]: 2025-11-23 21:18:11.399 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:18:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:11.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:18:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:18:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:18:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:18:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:12 np0005532763 nova_compute[231311]: 2025-11-23 21:18:12.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:18:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:12.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:13 np0005532763 nova_compute[231311]: 2025-11-23 21:18:13.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:18:13 np0005532763 nova_compute[231311]: 2025-11-23 21:18:13.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:18:13 np0005532763 nova_compute[231311]: 2025-11-23 21:18:13.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:18:13 np0005532763 nova_compute[231311]: 2025-11-23 21:18:13.496 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:18:13 np0005532763 nova_compute[231311]: 2025-11-23 21:18:13.497 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:18:13 np0005532763 nova_compute[231311]: 2025-11-23 21:18:13.497 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:18:13 np0005532763 nova_compute[231311]: 2025-11-23 21:18:13.498 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:18:13 np0005532763 nova_compute[231311]: 2025-11-23 21:18:13.498 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:18:13 np0005532763 nova_compute[231311]: 2025-11-23 21:18:13.525 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:13.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:13 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:18:13 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2575718697' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:18:13 np0005532763 nova_compute[231311]: 2025-11-23 21:18:13.986 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:18:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:14 np0005532763 nova_compute[231311]: 2025-11-23 21:18:14.240 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:18:14 np0005532763 nova_compute[231311]: 2025-11-23 21:18:14.242 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4873MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:18:14 np0005532763 nova_compute[231311]: 2025-11-23 21:18:14.242 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:18:14 np0005532763 nova_compute[231311]: 2025-11-23 21:18:14.242 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:18:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:14 np0005532763 nova_compute[231311]: 2025-11-23 21:18:14.574 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:18:14 np0005532763 nova_compute[231311]: 2025-11-23 21:18:14.575 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:18:14 np0005532763 nova_compute[231311]: 2025-11-23 21:18:14.609 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Refreshing inventories for resource provider 20c32e0a-de2c-427c-9273-fac11e2660f4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 23 16:18:14 np0005532763 nova_compute[231311]: 2025-11-23 21:18:14.637 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Updating ProviderTree inventory for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 23 16:18:14 np0005532763 nova_compute[231311]: 2025-11-23 21:18:14.637 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Updating inventory in ProviderTree for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 23 16:18:14 np0005532763 nova_compute[231311]: 2025-11-23 21:18:14.709 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Refreshing aggregate associations for resource provider 20c32e0a-de2c-427c-9273-fac11e2660f4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 23 16:18:14 np0005532763 nova_compute[231311]: 2025-11-23 21:18:14.713 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:14 np0005532763 nova_compute[231311]: 2025-11-23 21:18:14.764 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Refreshing trait associations for resource provider 20c32e0a-de2c-427c-9273-fac11e2660f4, traits: COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AVX2,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_FMA3,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE,HW_CPU_X86_SVM,HW_CPU_X86_BMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 23 16:18:14 np0005532763 nova_compute[231311]: 2025-11-23 21:18:14.788 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:18:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:14.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:18:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:18:15 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3445667768' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:18:15 np0005532763 nova_compute[231311]: 2025-11-23 21:18:15.254 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:18:15 np0005532763 nova_compute[231311]: 2025-11-23 21:18:15.259 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:18:15 np0005532763 nova_compute[231311]: 2025-11-23 21:18:15.271 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:18:15 np0005532763 nova_compute[231311]: 2025-11-23 21:18:15.272 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:18:15 np0005532763 nova_compute[231311]: 2025-11-23 21:18:15.273 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.030s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:18:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:15.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:16.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:18:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:18:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:18:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:18:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:17 np0005532763 nova_compute[231311]: 2025-11-23 21:18:17.274 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:18:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:17.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:18 np0005532763 nova_compute[231311]: 2025-11-23 21:18:18.504 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:18:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:18.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:18:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:19 np0005532763 nova_compute[231311]: 2025-11-23 21:18:19.750 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:19.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:18:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:20.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:21.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:18:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:18:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:18:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:18:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:22.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:23 np0005532763 nova_compute[231311]: 2025-11-23 21:18:23.507 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:23.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:24 np0005532763 nova_compute[231311]: 2025-11-23 21:18:24.783 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:24.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:18:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:25.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:26.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:18:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:18:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:18:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:18:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:27.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:28 np0005532763 systemd-logind[830]: New session 55 of user zuul.
Nov 23 16:18:28 np0005532763 systemd[1]: Started Session 55 of User zuul.
Nov 23 16:18:28 np0005532763 podman[244868]: 2025-11-23 21:18:28.400548761 +0000 UTC m=+0.088697030 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 16:18:28 np0005532763 nova_compute[231311]: 2025-11-23 21:18:28.534 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:28.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:29.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:29 np0005532763 nova_compute[231311]: 2025-11-23 21:18:29.785 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:18:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:30.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:31.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:18:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:18:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:18:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:18:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:32 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Nov 23 16:18:32 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1092092571' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Nov 23 16:18:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:32.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:33 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:18:33 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:18:33 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:18:33 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:18:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:33 np0005532763 nova_compute[231311]: 2025-11-23 21:18:33.580 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:33.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:34 np0005532763 nova_compute[231311]: 2025-11-23 21:18:34.788 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:34.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:18:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:35.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:36.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:18:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:18:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:18:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:18:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:37 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:18:37 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:18:37 np0005532763 ovs-vsctl[245376]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 23 16:18:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:38.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:38 np0005532763 podman[245386]: 2025-11-23 21:18:38.284388255 +0000 UTC m=+0.148163490 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 16:18:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:38 np0005532763 podman[245390]: 2025-11-23 21:18:38.313233893 +0000 UTC m=+0.177346178 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 23 16:18:38 np0005532763 nova_compute[231311]: 2025-11-23 21:18:38.584 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:38.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:39 np0005532763 virtqemud[230850]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 23 16:18:39 np0005532763 virtqemud[230850]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 23 16:18:39 np0005532763 virtqemud[230850]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 23 16:18:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:39 np0005532763 nova_compute[231311]: 2025-11-23 21:18:39.790 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:39 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: cache status {prefix=cache status} (starting...)
Nov 23 16:18:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:18:40 np0005532763 lvm[245732]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 16:18:40 np0005532763 lvm[245732]: VG ceph_vg0 finished
Nov 23 16:18:40 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: client ls {prefix=client ls} (starting...)
Nov 23 16:18:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:40.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:40 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: damage ls {prefix=damage ls} (starting...)
Nov 23 16:18:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:40.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:40 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: dump loads {prefix=dump loads} (starting...)
Nov 23 16:18:40 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Nov 23 16:18:40 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3489410471' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 23 16:18:40 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Nov 23 16:18:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:41 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Nov 23 16:18:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:41 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Nov 23 16:18:41 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 16:18:41 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3355703499' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 23 16:18:41 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Nov 23 16:18:41 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Nov 23 16:18:41 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2163106279' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Nov 23 16:18:41 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Nov 23 16:18:41 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: get subtrees {prefix=get subtrees} (starting...)
Nov 23 16:18:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:18:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:18:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:18:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:18:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:42.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:42 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: ops {prefix=ops} (starting...)
Nov 23 16:18:42 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Nov 23 16:18:42 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2372991850' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Nov 23 16:18:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:42 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Nov 23 16:18:42 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1648417367' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Nov 23 16:18:42 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: session ls {prefix=session ls} (starting...)
Nov 23 16:18:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:42.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:42 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Nov 23 16:18:42 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3156995856' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 23 16:18:42 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: status {prefix=status} (starting...)
Nov 23 16:18:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:43 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Nov 23 16:18:43 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4136860673' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 23 16:18:43 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Nov 23 16:18:43 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2429490044' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Nov 23 16:18:43 np0005532763 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 16:18:43 np0005532763 nova_compute[231311]: 2025-11-23 21:18:43.626 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:43 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Nov 23 16:18:43 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/121255193' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 23 16:18:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Nov 23 16:18:44 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3962888677' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Nov 23 16:18:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:44.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 23 16:18:44 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3690575320' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 23 16:18:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Nov 23 16:18:44 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3468948448' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Nov 23 16:18:44 np0005532763 nova_compute[231311]: 2025-11-23 21:18:44.792 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:18:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:44.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:18:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Nov 23 16:18:44 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1669620096' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Nov 23 16:18:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:18:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:45 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Nov 23 16:18:45 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3924259925' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 23 16:18:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:45 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Nov 23 16:18:45 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1492694451' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Nov 23 16:18:45 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Nov 23 16:18:45 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1580129398' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 23 16:18:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:46.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:46 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Nov 23 16:18:46 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4167770684' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70270976 unmapped: 991232 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70270976 unmapped: 991232 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862722 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70270976 unmapped: 991232 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70287360 unmapped: 974848 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70287360 unmapped: 974848 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70295552 unmapped: 966656 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70303744 unmapped: 958464 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862722 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70303744 unmapped: 958464 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70303744 unmapped: 958464 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70303744 unmapped: 958464 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7c4afc00 session 0x557d7be03860
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70303744 unmapped: 958464 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70311936 unmapped: 950272 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862722 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70311936 unmapped: 950272 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70303744 unmapped: 958464 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70320128 unmapped: 942080 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70336512 unmapped: 925696 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70344704 unmapped: 917504 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862722 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70344704 unmapped: 917504 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70352896 unmapped: 909312 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70361088 unmapped: 901120 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70361088 unmapped: 901120 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 892928 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862722 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 892928 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 892928 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70377472 unmapped: 884736 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 876544 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70393856 unmapped: 868352 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862722 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70393856 unmapped: 868352 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70393856 unmapped: 868352 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 860160 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 77.185600281s of 77.216033936s, submitted: 9
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70418432 unmapped: 843776 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 835584 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 862131 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 835584 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70451200 unmapped: 811008 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70459392 unmapped: 802816 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70467584 unmapped: 794624 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70467584 unmapped: 794624 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70475776 unmapped: 786432 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70483968 unmapped: 778240 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70492160 unmapped: 770048 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70492160 unmapped: 770048 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70492160 unmapped: 770048 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70500352 unmapped: 761856 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70508544 unmapped: 753664 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70516736 unmapped: 745472 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70508544 unmapped: 753664 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70508544 unmapped: 753664 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70516736 unmapped: 745472 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70516736 unmapped: 745472 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70516736 unmapped: 745472 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 720896 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 720896 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 712704 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 712704 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 712704 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 704512 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 704512 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70565888 unmapped: 696320 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70574080 unmapped: 688128 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70574080 unmapped: 688128 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70574080 unmapped: 688128 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 679936 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 679936 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70590464 unmapped: 671744 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70590464 unmapped: 671744 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70590464 unmapped: 671744 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70598656 unmapped: 663552 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70598656 unmapped: 663552 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70598656 unmapped: 663552 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70606848 unmapped: 655360 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70606848 unmapped: 655360 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70606848 unmapped: 655360 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70615040 unmapped: 647168 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70623232 unmapped: 638976 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70631424 unmapped: 630784 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70647808 unmapped: 614400 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70656000 unmapped: 606208 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70656000 unmapped: 606208 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70656000 unmapped: 606208 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70664192 unmapped: 598016 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70664192 unmapped: 598016 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70664192 unmapped: 598016 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70672384 unmapped: 589824 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70688768 unmapped: 573440 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 565248 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 565248 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 557056 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 557056 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 540672 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 532480 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 532480 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 524288 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 524288 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70754304 unmapped: 507904 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70762496 unmapped: 499712 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 491520 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 483328 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70787072 unmapped: 475136 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70803456 unmapped: 458752 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 450560 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70819840 unmapped: 442368 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 434176 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 434176 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70844416 unmapped: 417792 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70852608 unmapped: 409600 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70852608 unmapped: 409600 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70860800 unmapped: 401408 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70860800 unmapped: 401408 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70860800 unmapped: 401408 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70860800 unmapped: 401408 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70868992 unmapped: 393216 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70868992 unmapped: 393216 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70877184 unmapped: 385024 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70885376 unmapped: 376832 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 368640 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 368640 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70893568 unmapped: 368640 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70901760 unmapped: 360448 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 352256 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70909952 unmapped: 352256 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70942720 unmapped: 319488 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70942720 unmapped: 319488 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70942720 unmapped: 319488 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70967296 unmapped: 294912 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70967296 unmapped: 294912 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70975488 unmapped: 286720 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70983680 unmapped: 278528 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7a59d800 session 0x557d7a1aa960
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 70991872 unmapped: 270336 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71000064 unmapped: 262144 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71008256 unmapped: 253952 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71008256 unmapped: 253952 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71008256 unmapped: 253952 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71008256 unmapped: 253952 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71016448 unmapped: 245760 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71016448 unmapped: 245760 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 237568 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 237568 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 861540 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 237568 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71041024 unmapped: 221184 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71057408 unmapped: 204800 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71065600 unmapped: 196608 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 111.005218506s of 111.012519836s, submitted: 2
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71081984 unmapped: 180224 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863052 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71090176 unmapped: 172032 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 163840 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 163840 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 163840 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71106560 unmapped: 155648 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 864564 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 147456 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 139264 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 139264 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 131072 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 131072 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 864564 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 131072 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 122880 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 106496 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 106496 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 98304 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 864564 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 98304 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7c620400 session 0x557d7b37b680
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 73728 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 65536 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 65536 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 57344 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 864564 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 57344 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 57344 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 49152 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 40960 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 40960 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 864564 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 32768 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 32768 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 24576 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 24576 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 16384 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 864564 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 0 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1032192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 864564 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 1015808 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 37.157821655s of 37.165756226s, submitted: 2
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71294976 unmapped: 1015808 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 1007616 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 1007616 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71303168 unmapped: 1007616 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863382 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 983040 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 983040 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 983040 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863382 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 966656 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 966656 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71352320 unmapped: 958464 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863382 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863382 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 909312 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 909312 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 909312 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d799aa800 session 0x557d7c116000
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863382 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863382 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71450624 unmapped: 860160 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71458816 unmapped: 851968 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71458816 unmapped: 851968 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71467008 unmapped: 843776 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71467008 unmapped: 843776 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 863382 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71475200 unmapped: 835584 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 38.006263733s of 38.013668060s, submitted: 2
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 811008 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 864894 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71499776 unmapped: 811008 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 778240 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71532544 unmapped: 778240 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 745472 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71589888 unmapped: 720896 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 696320 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 696320 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 696320 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 696320 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71614464 unmapped: 696320 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71639040 unmapped: 671744 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71639040 unmapped: 671744 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 655360 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71655424 unmapped: 655360 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 630784 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 630784 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 630784 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 622592 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71688192 unmapped: 622592 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71696384 unmapped: 614400 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 581632 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71745536 unmapped: 565248 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71745536 unmapped: 565248 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 548864 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71761920 unmapped: 548864 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 499712 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 499712 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 499712 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71819264 unmapped: 491520 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71835648 unmapped: 475136 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 466944 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71843840 unmapped: 466944 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71852032 unmapped: 458752 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71860224 unmapped: 450560 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71860224 unmapped: 450560 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71860224 unmapped: 450560 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 442368 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71868416 unmapped: 442368 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 434176 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 434176 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71876608 unmapped: 434176 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71884800 unmapped: 425984 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71884800 unmapped: 425984 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71884800 unmapped: 425984 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71892992 unmapped: 417792 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71892992 unmapped: 417792 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 409600 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 409600 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 409600 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 401408 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 401408 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71917568 unmapped: 393216 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71917568 unmapped: 393216 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71917568 unmapped: 393216 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 385024 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 385024 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 385024 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71942144 unmapped: 368640 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71942144 unmapped: 368640 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71950336 unmapped: 360448 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71950336 unmapped: 360448 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71950336 unmapped: 360448 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 352256 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71958528 unmapped: 352256 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 344064 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.1 total, 600.0 interval
Cumulative writes: 5585 writes, 24K keys, 5585 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
Cumulative WAL: 5585 writes, 902 syncs, 6.19 writes per sync, written: 0.02 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 5585 writes, 24K keys, 5585 commit groups, 1.0 writes per commit group, ingest: 19.00 MB, 0.03 MB/s
Interval WAL: 5585 writes, 902 syncs, 6.19 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x557d78bc9350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x557d78bc9350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowd
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 270336 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 270336 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72040448 unmapped: 270336 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7c4af800 session 0x557d7a5d85a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72048640 unmapped: 262144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72048640 unmapped: 262144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72056832 unmapped: 253952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72056832 unmapped: 253952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 245760 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 245760 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72056832 unmapped: 253952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 245760 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 245760 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 866406 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72065024 unmapped: 245760 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72073216 unmapped: 237568 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72073216 unmapped: 237568 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72081408 unmapped: 229376 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72081408 unmapped: 229376 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 117.008460999s of 117.019020081s, submitted: 2
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869430 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72089600 unmapped: 221184 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72089600 unmapped: 221184 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72089600 unmapped: 221184 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 212992 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72097792 unmapped: 212992 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869430 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72105984 unmapped: 204800 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72105984 unmapped: 204800 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72105984 unmapped: 204800 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72105984 unmapped: 204800 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72114176 unmapped: 196608 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 868248 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72122368 unmapped: 188416 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72130560 unmapped: 180224 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72130560 unmapped: 180224 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72130560 unmapped: 180224 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 172032 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 868248 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72138752 unmapped: 172032 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 163840 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 163840 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72146944 unmapped: 163840 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 155648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 868248 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72155136 unmapped: 155648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 147456 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 147456 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72163328 unmapped: 147456 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72171520 unmapped: 139264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 868248 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72179712 unmapped: 131072 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 122880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 122880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 122880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 122880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 868248 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 122880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72187904 unmapped: 122880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 114688 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72196096 unmapped: 114688 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 106496 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 868248 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 106496 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 72204288 unmapped: 106496 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 409600 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 409600 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d799ab800 session 0x557d7cb9cd20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71901184 unmapped: 409600 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 868248 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 401408 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71909376 unmapped: 401408 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71917568 unmapped: 393216 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71917568 unmapped: 393216 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 385024 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 868248 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 385024 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71925760 unmapped: 385024 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71933952 unmapped: 376832 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 47.687278748s of 47.701545715s, submitted: 4
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 71966720 unmapped: 344064 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73064448 unmapped: 1343488 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 868248 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 1220608 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 1220608 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 1220608 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 1220608 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 1220608 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 868248 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 1220608 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73187328 unmapped: 1220608 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73195520 unmapped: 1212416 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73195520 unmapped: 1212416 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73195520 unmapped: 1212416 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 868248 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 1204224 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.517592430s of 13.321490288s, submitted: 224
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d79d0fc00 session 0x557d7b37b2c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 1204224 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 1204224 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 1204224 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 1204224 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869760 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 1204224 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d799aa800 session 0x557d7c5363c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 1204224 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 1204224 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73203712 unmapped: 1204224 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 1196032 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869760 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73211904 unmapped: 1196032 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 1187840 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73220096 unmapped: 1187840 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73228288 unmapped: 1179648 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 1171456 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 869760 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73236480 unmapped: 1171456 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 1163264 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73244672 unmapped: 1163264 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.482439041s of 17.486621857s, submitted: 1
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 1155072 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 1155072 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 872193 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73252864 unmapped: 1155072 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 1146880 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73261056 unmapped: 1146880 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73269248 unmapped: 1138688 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73269248 unmapped: 1138688 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 872193 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 1130496 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 1130496 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7c4afc00 session 0x557d7c536b40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73277440 unmapped: 1130496 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73285632 unmapped: 1122304 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73285632 unmapped: 1122304 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 872193 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 1114112 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 1114112 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73293824 unmapped: 1114112 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73302016 unmapped: 1105920 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73302016 unmapped: 1105920 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 872193 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 1097728 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 1097728 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73310208 unmapped: 1097728 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73318400 unmapped: 1089536 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73326592 unmapped: 1081344 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 872193 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.273010254s of 22.286701202s, submitted: 3
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 1064960 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73342976 unmapped: 1064960 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 873705 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 873705 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7a59d800 session 0x557d7c1490e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 873705 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 1048576 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 873705 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 1048576 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 1048576 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 1048576 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 1048576 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 1048576 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 873705 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 1048576 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7c620400 session 0x557d7c5372c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 873705 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.598381042s of 30.603197098s, submitted: 1
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 875217 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 875217 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.709115028s of 12.713610649s, submitted: 1
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878241 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 877650 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7a59d800 session 0x557d7c537e00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 877650 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 877650 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 877650 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73351168 unmapped: 1056768 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 1048576 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 1048576 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73359360 unmapped: 1048576 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.747240067s of 25.769412994s, submitted: 3
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880674 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73367552 unmapped: 1040384 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73375744 unmapped: 1032192 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 1024000 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 1024000 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73383936 unmapped: 1024000 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880674 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879492 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879492 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73392128 unmapped: 1015808 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879492 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d79d0fc00 session 0x557d7d021e00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879492 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73400320 unmapped: 1007616 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879492 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879492 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73408512 unmapped: 999424 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 39.327392578s of 39.346740723s, submitted: 4
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73424896 unmapped: 983040 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881004 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73433088 unmapped: 974848 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73441280 unmapped: 966656 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d799ab800 session 0x557d7c149c20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 71.702980042s of 71.722465515s, submitted: 3
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881334 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880743 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880743 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7c4afc00 session 0x557d7c4430e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880743 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880743 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880743 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880743 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880743 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 36.037124634s of 36.047756195s, submitted: 2
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d799ab800 session 0x557d7c9ea000
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880152 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880152 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880152 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.927753448s of 16.932226181s, submitted: 1
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d799aa800 session 0x557d7c116f00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 55.639171600s of 55.643314362s, submitted: 1
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883176 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883176 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882585 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882585 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882585 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 892928 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 892928 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 892928 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d79d0fc00 session 0x557d7c9eb2c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 892928 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882585 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 892928 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 892928 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 892928 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882585 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882585 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882585 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 876544 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 876544 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 876544 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 876544 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882585 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 49.515476227s of 49.525783539s, submitted: 2
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881994 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881994 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881994 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881994 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881994 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.195587158s of 25.200448990s, submitted: 1
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883506 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7a59d800 session 0x557d7c9f25a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 885018 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 885018 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 885018 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 827392 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 827392 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.715751648s of 19.726030350s, submitted: 2
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 827392 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 827392 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73596928 unmapped: 811008 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888042 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887451 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 786432 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 786432 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 786432 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 786432 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 786432 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 786432 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73629696 unmapped: 778240 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73629696 unmapped: 778240 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73637888 unmapped: 770048 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7c620400 session 0x557d7ce4c000
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread fragmentation_score=0.000025 took=0.000041s
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 720896 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 720896 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 720896 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 720896 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 720896 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 704512 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 704512 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 704512 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 704512 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 62.199523926s of 62.215599060s, submitted: 4
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 688128 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888372 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 679936 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 679936 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 679936 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 679936 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 679936 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887781 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 671744 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 671744 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73744384 unmapped: 663552 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887190 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887190 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73768960 unmapped: 638976 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73768960 unmapped: 638976 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887190 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887190 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d79d0f000 session 0x557d7c9efa40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887190 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887190 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 6101 writes, 25K keys, 6101 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 6101 writes, 1158 syncs, 5.27 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 516 writes, 817 keys, 516 commit groups, 1.0 writes per commit group, ingest: 0.27 MB, 0.00 MB/s
Interval WAL: 516 writes, 256 syncs, 2.02 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x557d78bc9350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x557d78bc9350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 581632 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 581632 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 581632 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 581632 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 573440 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887190 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 573440 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 573440 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 573440 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 573440 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 573440 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887190 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 573440 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 47.645763397s of 47.658298492s, submitted: 3
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888702 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888111 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888111 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d799aa800 session 0x557d7ce4da40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888111 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888111 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.442523956s of 28.453479767s, submitted: 2
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889623 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889623 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889032 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889032 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889032 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d799ab800 session 0x557d7ce80b40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889032 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.711717606s of 25.720556259s, submitted: 2
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74973184 unmapped: 483328 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76251136 unmapped: 253952 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76300288 unmapped: 204800 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 172032 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 172032 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889032 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 172032 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 172032 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 172032 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 172032 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 172032 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889032 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 172032 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.249409676s of 10.981097221s, submitted: 213
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 163840 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 892056 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 892056 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d79d0fc00 session 0x557d7ce67680
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 891465 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76357632 unmapped: 147456 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 891465 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76357632 unmapped: 147456 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76357632 unmapped: 147456 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76357632 unmapped: 147456 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76357632 unmapped: 147456 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.958477020s of 22.970237732s, submitted: 3
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76357632 unmapped: 147456 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 892977 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76357632 unmapped: 147456 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76357632 unmapped: 147456 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894489 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 893307 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 893307 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 893307 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 893307 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 893307 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7a70c800 session 0x557d7c9f25a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 893307 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7a59d800 session 0x557d7c117e00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 893307 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 893307 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 46.180438995s of 46.197158813s, submitted: 4
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76423168 unmapped: 81920 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894819 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76423168 unmapped: 81920 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76423168 unmapped: 81920 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76423168 unmapped: 81920 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895740 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895740 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7c4ae400 session 0x557d7c536f00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895740 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895740 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895740 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895740 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 38.757907867s of 38.770950317s, submitted: 3
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 43.226833344s of 43.231521606s, submitted: 1
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898173 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d799aa800 session 0x557d7cf09e00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898173 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898173 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.084466934s of 18.093536377s, submitted: 2
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903659 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 40960 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76701696 unmapped: 17637376 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 145 ms_handle_reset con 0x557d799ab800 session 0x557d7ce67680
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76701696 unmapped: 17637376 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 145 heartbeat osd_stat(store_statfs(0x4fbdf7000/0x0/0x4ffc00000, data 0xd70ce8/0xe23000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76701696 unmapped: 17637376 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 146 ms_handle_reset con 0x557d79d0fc00 session 0x557d7ce4de00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 999818 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fbdf4000/0x0/0x4ffc00000, data 0xd72e13/0xe27000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 146 handle_osd_map epochs [146,147], i have 146, src has [1,147]
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.925214767s of 10.168018341s, submitted: 38
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1001985 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf4000/0x0/0x4ffc00000, data 0xd72e13/0xe27000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1001394 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1001394 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 147 ms_handle_reset con 0x557d799aa800 session 0x557d7c9efe00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1001394 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1001394 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1001394 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.053102493s of 27.068101883s, submitted: 12
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1002906 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1001475 data_alloc: 218103808 data_used: 65536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf2000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 147 ms_handle_reset con 0x557d7c4ae400 session 0x557d7c336960
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 84492288 unmapped: 9846784 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 147 ms_handle_reset con 0x557d7c942000 session 0x557d7c336b40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf2000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 88104960 unmapped: 6234112 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1031723 data_alloc: 234881024 data_used: 11534336
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf2000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 88104960 unmapped: 6234112 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 147 handle_osd_map epochs [147,148], i have 147, src has [1,148]
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.397347450s of 14.407382011s, submitted: 2
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 88104960 unmapped: 6234112 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fbdee000/0x0/0x4ffc00000, data 0xd76ed1/0xe2d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 88104960 unmapped: 6234112 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 88104960 unmapped: 6234112 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 149 ms_handle_reset con 0x557d7d332800 session 0x557d7c336f00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 91455488 unmapped: 12402688 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1090641 data_alloc: 234881024 data_used: 11534336
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 149 ms_handle_reset con 0x557d7d332c00 session 0x557d7c3374a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 91455488 unmapped: 12402688 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 91455488 unmapped: 12402688 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 149 ms_handle_reset con 0x557d799aa800 session 0x557d7c337680
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 149 heartbeat osd_stat(store_statfs(0x4fb7a5000/0x0/0x4ffc00000, data 0x13bf011/0x1476000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 91447296 unmapped: 12410880 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 149 ms_handle_reset con 0x557d7c4ae400 session 0x557d7c337860
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 149 ms_handle_reset con 0x557d7c942000 session 0x557d7c337c20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 91668480 unmapped: 12189696 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 91684864 unmapped: 12173312 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1096615 data_alloc: 234881024 data_used: 11681792
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97640448 unmapped: 6217728 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97665024 unmapped: 6193152 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 149 heartbeat osd_stat(store_statfs(0x4fb781000/0x0/0x4ffc00000, data 0x13e3034/0x149b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.443373680s of 13.634446144s, submitted: 43
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1141981 data_alloc: 234881024 data_used: 17829888
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fb77d000/0x0/0x4ffc00000, data 0x13e5006/0x149e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fb77d000/0x0/0x4ffc00000, data 0x13e5006/0x149e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1141981 data_alloc: 234881024 data_used: 17829888
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 103718912 unmapped: 4366336 heap: 108085248 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fab2f000/0x0/0x4ffc00000, data 0x2026006/0x20df000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 103350272 unmapped: 4734976 heap: 108085248 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 103989248 unmapped: 5144576 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 103989248 unmapped: 5144576 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245847 data_alloc: 234881024 data_used: 18145280
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f990a000/0x0/0x4ffc00000, data 0x20b1006/0x216a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104022016 unmapped: 5111808 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f990a000/0x0/0x4ffc00000, data 0x20b1006/0x216a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104062976 unmapped: 5070848 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104062976 unmapped: 5070848 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.709860802s of 14.013646126s, submitted: 123
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104079360 unmapped: 5054464 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f990a000/0x0/0x4ffc00000, data 0x20b1006/0x216a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104095744 unmapped: 5038080 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1241983 data_alloc: 234881024 data_used: 18153472
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104095744 unmapped: 5038080 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104095744 unmapped: 5038080 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104095744 unmapped: 5038080 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104095744 unmapped: 5038080 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f98f1000/0x0/0x4ffc00000, data 0x20d2006/0x218b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104095744 unmapped: 5038080 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1241983 data_alloc: 234881024 data_used: 18153472
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 4882432 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f98e8000/0x0/0x4ffc00000, data 0x20db006/0x2194000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1242087 data_alloc: 234881024 data_used: 18153472
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f98e8000/0x0/0x4ffc00000, data 0x20db006/0x2194000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f98e8000/0x0/0x4ffc00000, data 0x20db006/0x2194000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7d333400 session 0x557d7c08c780
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c146000 session 0x557d7c555a40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c146000 session 0x557d7a087c20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7a087a40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1242087 data_alloc: 234881024 data_used: 18153472
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c4ae400 session 0x557d7a086780
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105717760 unmapped: 3416064 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f98e8000/0x0/0x4ffc00000, data 0x20db006/0x2194000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c942000 session 0x557d7c0c5860
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.610626221s of 17.632879257s, submitted: 5
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7d333400 session 0x557d7c0c4960
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7d333400 session 0x557d7c443c20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c442f00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c146000 session 0x557d7c442d20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c4ae400 session 0x557d7c4434a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 106233856 unmapped: 10911744 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 106233856 unmapped: 10911744 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 106233856 unmapped: 10911744 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f91f0000/0x0/0x4ffc00000, data 0x27d2016/0x288c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 106233856 unmapped: 10911744 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298305 data_alloc: 234881024 data_used: 18677760
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 106233856 unmapped: 10911744 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f91f0000/0x0/0x4ffc00000, data 0x27d2016/0x288c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c942000 session 0x557d7c4432c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 11558912 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105644032 unmapped: 11501568 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f91f0000/0x0/0x4ffc00000, data 0x27d2016/0x288c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112066560 unmapped: 5079040 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f91f0000/0x0/0x4ffc00000, data 0x27d2016/0x288c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112107520 unmapped: 5038080 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347766 data_alloc: 234881024 data_used: 25698304
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f91f0000/0x0/0x4ffc00000, data 0x27d2016/0x288c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112107520 unmapped: 5038080 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112107520 unmapped: 5038080 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112107520 unmapped: 5038080 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112107520 unmapped: 5038080 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.472221375s of 13.600175858s, submitted: 24
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112148480 unmapped: 4997120 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347854 data_alloc: 234881024 data_used: 25698304
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f91f0000/0x0/0x4ffc00000, data 0x27d2016/0x288c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112148480 unmapped: 4997120 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f91ed000/0x0/0x4ffc00000, data 0x27d5016/0x288f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 4964352 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112189440 unmapped: 4956160 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113737728 unmapped: 3407872 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 4390912 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1401956 data_alloc: 234881024 data_used: 26808320
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 114925568 unmapped: 5373952 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8a4e000/0x0/0x4ffc00000, data 0x2f74016/0x302e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 114925568 unmapped: 5373952 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 114925568 unmapped: 5373952 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 114925568 unmapped: 5373952 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8a4a000/0x0/0x4ffc00000, data 0x2f78016/0x3032000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 114892800 unmapped: 5406720 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1413436 data_alloc: 234881024 data_used: 26882048
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.477304459s of 10.752337456s, submitted: 95
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 114892800 unmapped: 5406720 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 5152768 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 5152768 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c942000 session 0x557d7b4aa3c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7a04bc20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8a28000/0x0/0x4ffc00000, data 0x2f9a016/0x3054000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110256128 unmapped: 10043392 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7cf02f00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 10027008 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252882 data_alloc: 234881024 data_used: 18665472
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 10027008 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f98df000/0x0/0x4ffc00000, data 0x20e4006/0x219d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110288896 unmapped: 10010624 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110288896 unmapped: 10010624 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f98da000/0x0/0x4ffc00000, data 0x20e9006/0x21a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110288896 unmapped: 10010624 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7d333000 session 0x557d7c9ee3c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7d332800 session 0x557d7c336960
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110313472 unmapped: 9986048 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1073048 data_alloc: 234881024 data_used: 12181504
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c9ee000
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105013248 unmapped: 15286272 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70c800 session 0x557d7c149a40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105013248 unmapped: 15286272 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105013248 unmapped: 15286272 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fac48000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fac48000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065848 data_alloc: 234881024 data_used: 12066816
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a59d800 session 0x557d79a99c20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fac48000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065848 data_alloc: 234881024 data_used: 12066816
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fac48000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065848 data_alloc: 234881024 data_used: 12066816
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 24.807607651s of 25.014438629s, submitted: 70
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 15622144 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 15622144 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 15622144 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 15622144 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c942000 session 0x557d7ae2b680
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7d333000 session 0x557d7cea5a40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c08de00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 15622144 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a59d800 session 0x557d7c555a40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1096410 data_alloc: 234881024 data_used: 12066816
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70c800 session 0x557d7c055680
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa473000/0x0/0x4ffc00000, data 0x1550fe3/0x1609000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104955904 unmapped: 19677184 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104955904 unmapped: 19677184 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa473000/0x0/0x4ffc00000, data 0x1550fe3/0x1609000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104955904 unmapped: 19677184 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa473000/0x0/0x4ffc00000, data 0x1550fe3/0x1609000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c942000 session 0x557d7b2abc20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104955904 unmapped: 19677184 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef5c00 session 0x557d7c116000
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104955904 unmapped: 19677184 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1140250 data_alloc: 234881024 data_used: 12066816
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7d020d20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a59d800 session 0x557d7c1163c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104988672 unmapped: 19644416 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105193472 unmapped: 19439616 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa472000/0x0/0x4ffc00000, data 0x1551006/0x160a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa472000/0x0/0x4ffc00000, data 0x1551006/0x160a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194519 data_alloc: 234881024 data_used: 19894272
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa472000/0x0/0x4ffc00000, data 0x1551006/0x160a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194519 data_alloc: 234881024 data_used: 19894272
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa472000/0x0/0x4ffc00000, data 0x1551006/0x160a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.159788132s of 22.263095856s, submitted: 29
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115286016 unmapped: 9347072 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 8691712 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113426432 unmapped: 11206656 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1333801 data_alloc: 234881024 data_used: 20561920
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113426432 unmapped: 11206656 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113426432 unmapped: 11206656 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8e34000/0x0/0x4ffc00000, data 0x2777006/0x2830000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113426432 unmapped: 11206656 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113426432 unmapped: 11206656 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113016832 unmapped: 11616256 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1328289 data_alloc: 234881024 data_used: 20561920
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113016832 unmapped: 11616256 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8e1d000/0x0/0x4ffc00000, data 0x2796006/0x284f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113016832 unmapped: 11616256 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8e1d000/0x0/0x4ffc00000, data 0x2796006/0x284f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113049600 unmapped: 11583488 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113049600 unmapped: 11583488 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113049600 unmapped: 11583488 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1328289 data_alloc: 234881024 data_used: 20561920
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.521698952s of 12.875432968s, submitted: 169
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 11476992 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8e10000/0x0/0x4ffc00000, data 0x27a3006/0x285c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113188864 unmapped: 11444224 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113197056 unmapped: 11436032 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8e10000/0x0/0x4ffc00000, data 0x27a3006/0x285c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 11427840 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 11427840 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1328425 data_alloc: 234881024 data_used: 20561920
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8e10000/0x0/0x4ffc00000, data 0x27a3006/0x285c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 11427840 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 11427840 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 11427840 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 11427840 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 11427840 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1329345 data_alloc: 234881024 data_used: 20578304
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8e0d000/0x0/0x4ffc00000, data 0x27a6006/0x285f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 11427840 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 11419648 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 11419648 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70c800 session 0x557d7ce7f680
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.209961891s of 13.227853775s, submitted: 4
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108429312 unmapped: 16203776 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef5800 session 0x557d7a1aa3c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1081933 data_alloc: 234881024 data_used: 12066816
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1081933 data_alloc: 234881024 data_used: 12066816
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1081933 data_alloc: 234881024 data_used: 12066816
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1081933 data_alloc: 234881024 data_used: 12066816
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.695468903s of 20.757802963s, submitted: 25
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef4400 session 0x557d7c149c20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 24166400 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170017 data_alloc: 234881024 data_used: 12066816
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 24166400 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9cc0000/0x0/0x4ffc00000, data 0x18f3fe3/0x19ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 24166400 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 24166400 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 24166400 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9cc0000/0x0/0x4ffc00000, data 0x18f3fe3/0x19ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 24166400 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9cc0000/0x0/0x4ffc00000, data 0x18f3fe3/0x19ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170017 data_alloc: 234881024 data_used: 12066816
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 24166400 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef4400 session 0x557d7c9ebe00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7d002000
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9cc0000/0x0/0x4ffc00000, data 0x18f3fe3/0x19ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a59d800 session 0x557d7b37a1e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108109824 unmapped: 24403968 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108109824 unmapped: 24403968 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108109824 unmapped: 24403968 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108109824 unmapped: 24403968 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1089434 data_alloc: 234881024 data_used: 12066816
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.831846237s of 10.980201721s, submitted: 47
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70cc00 session 0x557d7c7b5680
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70d000 session 0x557d7be034a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70c800 session 0x557d7b37b2c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70cc00 session 0x557d7b2aad20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70d000 session 0x557d7d0205a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108445696 unmapped: 24068096 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108445696 unmapped: 24068096 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa686000/0x0/0x4ffc00000, data 0xf2d045/0xfe6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108445696 unmapped: 24068096 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108445696 unmapped: 24068096 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108445696 unmapped: 24068096 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7d021680
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1111663 data_alloc: 234881024 data_used: 12066816
Nov 23 16:18:46 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa686000/0x0/0x4ffc00000, data 0xf2d045/0xfe6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108470272 unmapped: 24043520 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108150784 unmapped: 24363008 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa686000/0x0/0x4ffc00000, data 0xf2d045/0xfe6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108150784 unmapped: 24363008 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2023134629' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108150784 unmapped: 24363008 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108150784 unmapped: 24363008 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1116031 data_alloc: 234881024 data_used: 12627968
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108150784 unmapped: 24363008 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa686000/0x0/0x4ffc00000, data 0xf2d045/0xfe6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108150784 unmapped: 24363008 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108150784 unmapped: 24363008 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa686000/0x0/0x4ffc00000, data 0xf2d045/0xfe6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 106979328 unmapped: 25534464 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa686000/0x0/0x4ffc00000, data 0xf2d045/0xfe6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 106979328 unmapped: 25534464 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1116031 data_alloc: 234881024 data_used: 12627968
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 106979328 unmapped: 25534464 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.528369904s of 16.666515350s, submitted: 46
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 109846528 unmapped: 22667264 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110518272 unmapped: 21995520 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110682112 unmapped: 21831680 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9e16000/0x0/0x4ffc00000, data 0x179d045/0x1856000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110010368 unmapped: 22503424 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182843 data_alloc: 234881024 data_used: 12701696
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110010368 unmapped: 22503424 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110010368 unmapped: 22503424 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9e16000/0x0/0x4ffc00000, data 0x179d045/0x1856000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182859 data_alloc: 234881024 data_used: 12701696
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9e16000/0x0/0x4ffc00000, data 0x179d045/0x1856000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182859 data_alloc: 234881024 data_used: 12701696
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9e16000/0x0/0x4ffc00000, data 0x179d045/0x1856000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a0c2000 session 0x557d7ce7fe00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7ce7eb40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a0c2000 session 0x557d7c9eb4a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70c800 session 0x557d7a0863c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.219844818s of 16.448907852s, submitted: 71
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70cc00 session 0x557d7c7b52c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70d000 session 0x557d7c443e00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c9ee1e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a0c2000 session 0x557d7ce7e000
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70c800 session 0x557d7c7b50e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111042560 unmapped: 27295744 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111042560 unmapped: 27295744 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265830 data_alloc: 234881024 data_used: 12701696
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111042560 unmapped: 27295744 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111042560 unmapped: 27295744 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111042560 unmapped: 27295744 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9420000/0x0/0x4ffc00000, data 0x2192055/0x224c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9420000/0x0/0x4ffc00000, data 0x2192055/0x224c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111050752 unmapped: 27287552 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111050752 unmapped: 27287552 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265830 data_alloc: 234881024 data_used: 12701696
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111050752 unmapped: 27287552 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70cc00 session 0x557d7c116b40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9420000/0x0/0x4ffc00000, data 0x2192055/0x224c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 28319744 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110043136 unmapped: 28295168 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1339792 data_alloc: 234881024 data_used: 21274624
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f93fb000/0x0/0x4ffc00000, data 0x21b6078/0x2271000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1339792 data_alloc: 234881024 data_used: 21274624
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f93fb000/0x0/0x4ffc00000, data 0x21b6078/0x2271000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.856174469s of 20.003021240s, submitted: 38
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118865920 unmapped: 19472384 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 18563072 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1466068 data_alloc: 234881024 data_used: 22339584
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 18563072 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8673000/0x0/0x4ffc00000, data 0x2f3e078/0x2ff9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 18563072 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 18563072 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 18563072 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8673000/0x0/0x4ffc00000, data 0x2f3e078/0x2ff9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119808000 unmapped: 18530304 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1467556 data_alloc: 234881024 data_used: 22343680
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119939072 unmapped: 18399232 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119947264 unmapped: 18391040 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7ce80960
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c0fa5a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119963648 unmapped: 18374656 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.752678871s of 10.058499336s, submitted: 128
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7ce7fe00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112058368 unmapped: 26279936 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9e15000/0x0/0x4ffc00000, data 0x179d045/0x1856000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112058368 unmapped: 26279936 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1202264 data_alloc: 234881024 data_used: 11063296
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9e15000/0x0/0x4ffc00000, data 0x179d045/0x1856000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112058368 unmapped: 26279936 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112058368 unmapped: 26279936 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a59d800 session 0x557d7b4aa000
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef5800 session 0x557d7c055860
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 26263552 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a0c2000 session 0x557d7c5552c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9e16000/0x0/0x4ffc00000, data 0x179d045/0x1856000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1119422 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1119422 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1119422 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111230976 unmapped: 27107328 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1119422 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111230976 unmapped: 27107328 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111230976 unmapped: 27107328 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111230976 unmapped: 27107328 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111230976 unmapped: 27107328 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111230976 unmapped: 27107328 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1119422 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111230976 unmapped: 27107328 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111230976 unmapped: 27107328 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111239168 unmapped: 27099136 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111239168 unmapped: 27099136 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 27090944 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1119422 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 27090944 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 27090944 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 34.560401917s of 34.893112183s, submitted: 105
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111263744 unmapped: 27074560 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1,4])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c0c52c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7a387680
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a59d800 session 0x557d7cea4780
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef5800 session 0x557d7c9ee1e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70c800 session 0x557d7c9ef860
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111493120 unmapped: 36347904 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111493120 unmapped: 36347904 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253071 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 36339712 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c9ef0e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 36339712 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c9ee960
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111509504 unmapped: 36331520 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f95d3000/0x0/0x4ffc00000, data 0x1fe0fe3/0x2099000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a59d800 session 0x557d7c9ee000
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef5800 session 0x557d7c9ef4a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111902720 unmapped: 35938304 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111902720 unmapped: 35938304 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262404 data_alloc: 234881024 data_used: 11075584
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 27385856 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 26066944 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 26066944 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 26058752 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f95ae000/0x0/0x4ffc00000, data 0x2005006/0x20be000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121446400 unmapped: 26394624 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1387956 data_alloc: 251658240 data_used: 28327936
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121446400 unmapped: 26394624 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121446400 unmapped: 26394624 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121446400 unmapped: 26394624 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121446400 unmapped: 26394624 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121446400 unmapped: 26394624 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f95ae000/0x0/0x4ffc00000, data 0x2005006/0x20be000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1387956 data_alloc: 251658240 data_used: 28327936
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.526510239s of 17.687055588s, submitted: 31
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128393216 unmapped: 19447808 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8b13000/0x0/0x4ffc00000, data 0x2aa0006/0x2b59000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128761856 unmapped: 19079168 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8ae5000/0x0/0x4ffc00000, data 0x2ace006/0x2b87000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1487708 data_alloc: 251658240 data_used: 29290496
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8ae5000/0x0/0x4ffc00000, data 0x2ace006/0x2b87000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8ae2000/0x0/0x4ffc00000, data 0x2ad1006/0x2b8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1486548 data_alloc: 251658240 data_used: 29282304
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 8518 writes, 34K keys, 8518 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 8518 writes, 2145 syncs, 3.97 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2417 writes, 8796 keys, 2417 commit groups, 1.0 writes per commit group, ingest: 9.65 MB, 0.02 MB/s#012Interval WAL: 2417 writes, 987 syncs, 2.45 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8ae2000/0x0/0x4ffc00000, data 0x2ad1006/0x2b8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1486548 data_alloc: 251658240 data_used: 29282304
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8ae2000/0x0/0x4ffc00000, data 0x2ad1006/0x2b8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1487156 data_alloc: 251658240 data_used: 29265920
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.798316956s of 20.030382156s, submitted: 95
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1487388 data_alloc: 251658240 data_used: 29265920
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8ae0000/0x0/0x4ffc00000, data 0x2ad3006/0x2b8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128278528 unmapped: 19562496 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7af13c00 session 0x557d7cb9c3c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c08c5a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 18423808 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 18423808 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 18423808 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f83ac000/0x0/0x4ffc00000, data 0x3206068/0x32c0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 18391040 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1548005 data_alloc: 251658240 data_used: 29265920
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c055680
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a59d800 session 0x557d7b4aa5a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 18374656 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef5800 session 0x557d7c9f34a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.146596909s of 11.267215729s, submitted: 38
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c93e400 session 0x557d7c0c5c20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128778240 unmapped: 19062784 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128778240 unmapped: 19062784 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8386000/0x0/0x4ffc00000, data 0x322a09b/0x32e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130711552 unmapped: 17129472 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133169152 unmapped: 14671872 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1594572 data_alloc: 251658240 data_used: 35155968
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133185536 unmapped: 14655488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133185536 unmapped: 14655488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133185536 unmapped: 14655488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8386000/0x0/0x4ffc00000, data 0x322a09b/0x32e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133193728 unmapped: 14647296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133193728 unmapped: 14647296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1595180 data_alloc: 251658240 data_used: 35184640
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133193728 unmapped: 14647296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8386000/0x0/0x4ffc00000, data 0x322a09b/0x32e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133193728 unmapped: 14647296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.182846069s of 11.216772079s, submitted: 9
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133218304 unmapped: 14622720 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8385000/0x0/0x4ffc00000, data 0x322b09b/0x32e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137166848 unmapped: 10674176 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138264576 unmapped: 9576448 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1676692 data_alloc: 251658240 data_used: 36057088
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138297344 unmapped: 9543680 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138297344 unmapped: 9543680 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b62000/0x0/0x4ffc00000, data 0x3a4d09b/0x3b09000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138297344 unmapped: 9543680 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138297344 unmapped: 9543680 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138297344 unmapped: 9543680 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b62000/0x0/0x4ffc00000, data 0x3a4d09b/0x3b09000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1677148 data_alloc: 251658240 data_used: 36069376
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138297344 unmapped: 9543680 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b3f000/0x0/0x4ffc00000, data 0x3a7109b/0x3b2d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1676300 data_alloc: 251658240 data_used: 36073472
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.274935722s of 12.588764191s, submitted: 93
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b3b000/0x0/0x4ffc00000, data 0x3a7509b/0x3b31000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1676196 data_alloc: 251658240 data_used: 36073472
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b3b000/0x0/0x4ffc00000, data 0x3a7509b/0x3b31000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7ce67860
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1676116 data_alloc: 251658240 data_used: 36073472
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b38000/0x0/0x4ffc00000, data 0x3a7809b/0x3b34000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138289152 unmapped: 9551872 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.778336525s of 10.798649788s, submitted: 5
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138289152 unmapped: 9551872 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138289152 unmapped: 9551872 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b34000/0x0/0x4ffc00000, data 0x3a7909b/0x3b35000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1676156 data_alloc: 251658240 data_used: 36073472
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138321920 unmapped: 9519104 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b34000/0x0/0x4ffc00000, data 0x3a7909b/0x3b35000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138321920 unmapped: 9519104 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1676156 data_alloc: 251658240 data_used: 36073472
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138321920 unmapped: 9519104 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b37000/0x0/0x4ffc00000, data 0x3a7909b/0x3b35000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138321920 unmapped: 9519104 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.394987106s of 11.418004036s, submitted: 6
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138346496 unmapped: 9494528 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 9437184 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138625024 unmapped: 9216000 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1675828 data_alloc: 251658240 data_used: 36073472
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b36000/0x0/0x4ffc00000, data 0x3a7a09b/0x3b36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,1])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1675996 data_alloc: 251658240 data_used: 36073472
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b36000/0x0/0x4ffc00000, data 0x3a7a09b/0x3b36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138756096 unmapped: 9084928 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.908274651s of 12.600452423s, submitted: 229
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1676012 data_alloc: 251658240 data_used: 36073472
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b36000/0x0/0x4ffc00000, data 0x3a7a09b/0x3b36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b36000/0x0/0x4ffc00000, data 0x3a7a09b/0x3b36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138756096 unmapped: 9084928 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138756096 unmapped: 9084928 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138756096 unmapped: 9084928 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b36000/0x0/0x4ffc00000, data 0x3a7a09b/0x3b36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138756096 unmapped: 9084928 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138756096 unmapped: 9084928 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1676012 data_alloc: 251658240 data_used: 36073472
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b36000/0x0/0x4ffc00000, data 0x3a7a09b/0x3b36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138756096 unmapped: 9084928 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b35000/0x0/0x4ffc00000, data 0x3a7b09b/0x3b37000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138756096 unmapped: 9084928 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138764288 unmapped: 9076736 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c0c4780
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7ceb23c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b35000/0x0/0x4ffc00000, data 0x3a7b09b/0x3b37000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138764288 unmapped: 9076736 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c93e400 session 0x557d7ce7e780
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132513792 unmapped: 15327232 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1500695 data_alloc: 251658240 data_used: 28966912
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.117815018s of 10.239953041s, submitted: 35
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f879a000/0x0/0x4ffc00000, data 0x2ad8006/0x2b91000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132513792 unmapped: 15327232 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132513792 unmapped: 15327232 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132513792 unmapped: 15327232 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132513792 unmapped: 15327232 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132513792 unmapped: 15327232 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70cc00 session 0x557d7ceb32c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7c08d4a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1499887 data_alloc: 251658240 data_used: 28966912
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c117860
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa814000/0x0/0x4ffc00000, data 0xd9f006/0xe58000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147879 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147879 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147879 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147879 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7a5d8780
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7ae2b0e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c93e400 session 0x557d7b2c1a40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c9eef00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.234661102s of 27.350326538s, submitted: 42
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7a5d85a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7c443860
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7c4434a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef5800 session 0x557d7c442f00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c443e00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 41517056 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 41517056 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 41517056 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280731 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 41517056 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c9eeb40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120111104 unmapped: 41451520 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120135680 unmapped: 41426944 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f97b0000/0x0/0x4ffc00000, data 0x1e01078/0x1ebc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f97b0000/0x0/0x4ffc00000, data 0x1e01078/0x1ebc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120397824 unmapped: 41164800 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f97b0000/0x0/0x4ffc00000, data 0x1e01078/0x1ebc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 35766272 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1399096 data_alloc: 251658240 data_used: 27426816
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 35766272 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7b4ab4a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7cea4780
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01000 session 0x557d7c9ee000
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f97b0000/0x0/0x4ffc00000, data 0x1e01078/0x1ebc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162712 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa627000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162712 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01000 session 0x557d7ae2b0e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c9eb4a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c9eab40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7c9eba40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.146602631s of 18.466539383s, submitted: 106
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 42254336 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7c9ea000
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7a5d90e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7a5d85a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c0c5680
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7cea4000
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9c57000/0x0/0x4ffc00000, data 0x195bff3/0x1a15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121626624 unmapped: 44138496 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121626624 unmapped: 44138496 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01000 session 0x557d7c9ea1e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121626624 unmapped: 44138496 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01000 session 0x557d7c0fa1e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121536512 unmapped: 44228608 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1259248 data_alloc: 234881024 data_used: 10432512
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7cea4d20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7d0205a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121487360 unmapped: 44277760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 44474368 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9c55000/0x0/0x4ffc00000, data 0x195c026/0x1a17000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 124338176 unmapped: 41426944 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7c0fbe00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01400 session 0x557d7c5552c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 124321792 unmapped: 41443328 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119963648 unmapped: 45801472 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1174413 data_alloc: 234881024 data_used: 10432512
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c0fa1e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119971840 unmapped: 45793280 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: mgrc ms_handle_reset ms_handle_reset con 0x557d7a828000
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/844402651
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/844402651,v1:192.168.122.100:6801/844402651]
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: mgrc handle_mgr_configure stats_period=5
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1173148 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1173148 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1173148 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1173148 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1173148 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 37.237335205s of 37.444107056s, submitted: 49
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7c1172c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7c7b4b40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120184832 unmapped: 45580288 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9ede000/0x0/0x4ffc00000, data 0x16d5045/0x178e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120184832 unmapped: 45580288 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1242171 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120184832 unmapped: 45580288 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120184832 unmapped: 45580288 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120184832 unmapped: 45580288 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120184832 unmapped: 45580288 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9ede000/0x0/0x4ffc00000, data 0x16d5045/0x178e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120184832 unmapped: 45580288 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1242171 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01000 session 0x557d7c7b41e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120487936 unmapped: 45277184 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120504320 unmapped: 45260800 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9eba000/0x0/0x4ffc00000, data 0x16f9045/0x17b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1312292 data_alloc: 234881024 data_used: 20238336
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9eba000/0x0/0x4ffc00000, data 0x16f9045/0x17b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9eba000/0x0/0x4ffc00000, data 0x16f9045/0x17b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9eba000/0x0/0x4ffc00000, data 0x16f9045/0x17b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9eba000/0x0/0x4ffc00000, data 0x16f9045/0x17b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1312292 data_alloc: 234881024 data_used: 20238336
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.107950211s of 19.235004425s, submitted: 37
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 38871040 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 38871040 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 38846464 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9a73000/0x0/0x4ffc00000, data 0x1b40045/0x1bf9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357756 data_alloc: 234881024 data_used: 20475904
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 38846464 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 38846464 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 38846464 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9a73000/0x0/0x4ffc00000, data 0x1b40045/0x1bf9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 38813696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 38813696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357756 data_alloc: 234881024 data_used: 20475904
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9a73000/0x0/0x4ffc00000, data 0x1b40045/0x1bf9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 38780928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 38780928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 38780928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 38780928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9a73000/0x0/0x4ffc00000, data 0x1b40045/0x1bf9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 38780928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357756 data_alloc: 234881024 data_used: 20475904
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9a73000/0x0/0x4ffc00000, data 0x1b40045/0x1bf9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 38780928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 38772736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 38772736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.788988113s of 15.941099167s, submitted: 48
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01800 session 0x557d7c0fa5a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01c00 session 0x557d7cf02000
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7ceb2f00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9a73000/0x0/0x4ffc00000, data 0x1b40045/0x1bf9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1185834 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa49e000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1185834 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa49e000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa49e000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1185834 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa49e000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120463360 unmapped: 45301760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120463360 unmapped: 45301760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120463360 unmapped: 45301760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1185834 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120463360 unmapped: 45301760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa49e000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 45293568 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 45293568 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 45293568 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7b44bc20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7c9eb680
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 45293568 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01000 session 0x557d7c9efe00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1185834 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c7b5a40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.021965027s of 22.198305130s, submitted: 38
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7c0fbc20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7a087860
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01c00 session 0x557d7cea5e00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00c00 session 0x557d7cb9c3c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7ceb32c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 44818432 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120954880 unmapped: 44810240 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9fd4000/0x0/0x4ffc00000, data 0x15de055/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 44802048 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 44802048 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 44802048 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258869 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7ce7e780
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 44802048 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7c1161e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01c00 session 0x557d7ce810e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9fd4000/0x0/0x4ffc00000, data 0x15de055/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00800 session 0x557d7ce7f4a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 44638208 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 44613632 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9b9f000/0x0/0x4ffc00000, data 0x1602065/0x16bd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 44736512 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 44916736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1315524 data_alloc: 234881024 data_used: 17928192
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 44916736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 44916736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 44916736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9b9f000/0x0/0x4ffc00000, data 0x1602065/0x16bd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 44916736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 44916736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1315524 data_alloc: 234881024 data_used: 17928192
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 44916736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9b9f000/0x0/0x4ffc00000, data 0x1602065/0x16bd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 44916736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7a387860
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7a387c20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01c00 session 0x557d7a3870e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00400 session 0x557d7ce81860
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.361158371s of 17.506059647s, submitted: 48
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00000 session 0x557d7ce812c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 44023808 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 44023808 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f90f8000/0x0/0x4ffc00000, data 0x20a9065/0x2164000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [0,3,3,1])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130891776 unmapped: 34873344 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1453440 data_alloc: 234881024 data_used: 19214336
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129777664 unmapped: 35987456 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129818624 unmapped: 35946496 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8c51000/0x0/0x4ffc00000, data 0x254f065/0x260a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129818624 unmapped: 35946496 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129818624 unmapped: 35946496 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129818624 unmapped: 35946496 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1453290 data_alloc: 234881024 data_used: 19423232
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7a04b680
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7c9eb680
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129818624 unmapped: 35946496 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00000 session 0x557d7c9eb4a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00400 session 0x557d7c9eab40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129835008 unmapped: 35930112 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8c52000/0x0/0x4ffc00000, data 0x254f065/0x260a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129835008 unmapped: 35930112 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131047424 unmapped: 34717696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131047424 unmapped: 34717696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1482239 data_alloc: 234881024 data_used: 23588864
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131047424 unmapped: 34717696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131047424 unmapped: 34717696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131047424 unmapped: 34717696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8c52000/0x0/0x4ffc00000, data 0x254f065/0x260a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8c52000/0x0/0x4ffc00000, data 0x254f065/0x260a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131047424 unmapped: 34717696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131047424 unmapped: 34717696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1482239 data_alloc: 234881024 data_used: 23588864
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8c52000/0x0/0x4ffc00000, data 0x254f065/0x260a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131080192 unmapped: 34684928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131080192 unmapped: 34684928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131080192 unmapped: 34684928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131080192 unmapped: 34684928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 21.508337021s of 21.807014465s, submitted: 132
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8427000/0x0/0x4ffc00000, data 0x2d72065/0x2e2d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134545408 unmapped: 31219712 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1544079 data_alloc: 234881024 data_used: 23617536
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138452992 unmapped: 27312128 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7f42000/0x0/0x4ffc00000, data 0x325f065/0x331a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138682368 unmapped: 27082752 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138240000 unmapped: 27525120 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138240000 unmapped: 27525120 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7ea1000/0x0/0x4ffc00000, data 0x32f7065/0x33b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138272768 unmapped: 27492352 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1601253 data_alloc: 234881024 data_used: 25538560
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138272768 unmapped: 27492352 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138272768 unmapped: 27492352 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137764864 unmapped: 28000256 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7e89000/0x0/0x4ffc00000, data 0x3318065/0x33d3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137773056 unmapped: 27992064 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7e89000/0x0/0x4ffc00000, data 0x3318065/0x33d3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137781248 unmapped: 27983872 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:46.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1593645 data_alloc: 234881024 data_used: 25538560
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137789440 unmapped: 27975680 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.961110115s of 12.277703285s, submitted: 147
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137789440 unmapped: 27975680 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7e89000/0x0/0x4ffc00000, data 0x3318065/0x33d3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137789440 unmapped: 27975680 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7e89000/0x0/0x4ffc00000, data 0x3318065/0x33d3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137789440 unmapped: 27975680 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7e89000/0x0/0x4ffc00000, data 0x3318065/0x33d3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137789440 unmapped: 27975680 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1593645 data_alloc: 234881024 data_used: 25538560
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137814016 unmapped: 27951104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01c00 session 0x557d7a04ad20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c4ae800 session 0x557d7ceb25a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137814016 unmapped: 27951104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7b2aab40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132710400 unmapped: 33054720 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132710400 unmapped: 33054720 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f91b9000/0x0/0x4ffc00000, data 0x1fe8065/0x20a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132710400 unmapped: 33054720 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1409603 data_alloc: 234881024 data_used: 19427328
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00800 session 0x557d7ceb2780
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7cea5c20
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132718592 unmapped: 33046528 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.836650848s of 10.001040459s, submitted: 60
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7b2ab860
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212612 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212612 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212612 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212612 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128606208 unmapped: 37158912 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128606208 unmapped: 37158912 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128606208 unmapped: 37158912 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128606208 unmapped: 37158912 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212612 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128606208 unmapped: 37158912 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128606208 unmapped: 37158912 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7a5d8000
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7b4aa3c0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7c7b4960
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c4ae800 session 0x557d7c054b40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [0,0,0,0,0,1])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.840406418s of 25.882516861s, submitted: 12
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00800 session 0x557d7c0541e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00800 session 0x557d7b2aab40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7b2ab860
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7b2ab0e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7ceb2780
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128778240 unmapped: 40665088 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96ce000/0x0/0x4ffc00000, data 0x1ad4055/0x1b8e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128778240 unmapped: 40665088 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128778240 unmapped: 40665088 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1315173 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128786432 unmapped: 40656896 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128786432 unmapped: 40656896 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128786432 unmapped: 40656896 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c4ae800 session 0x557d7ceb21e0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128786432 unmapped: 40656896 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7ceb3a40
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96ce000/0x0/0x4ffc00000, data 0x1ad4055/0x1b8e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96ce000/0x0/0x4ffc00000, data 0x1ad4055/0x1b8e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7c9eb680
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7c9eb4a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129114112 unmapped: 40329216 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1320008 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96a9000/0x0/0x4ffc00000, data 0x1af8065/0x1bb3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129114112 unmapped: 40329216 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131227648 unmapped: 38215680 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134152192 unmapped: 35291136 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134160384 unmapped: 35282944 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96a9000/0x0/0x4ffc00000, data 0x1af8065/0x1bb3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134160384 unmapped: 35282944 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1414684 data_alloc: 234881024 data_used: 24420352
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134160384 unmapped: 35282944 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96a9000/0x0/0x4ffc00000, data 0x1af8065/0x1bb3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134193152 unmapped: 35250176 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96a9000/0x0/0x4ffc00000, data 0x1af8065/0x1bb3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134225920 unmapped: 35217408 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96a9000/0x0/0x4ffc00000, data 0x1af8065/0x1bb3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134225920 unmapped: 35217408 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134225920 unmapped: 35217408 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1414684 data_alloc: 234881024 data_used: 24420352
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134225920 unmapped: 35217408 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134234112 unmapped: 35209216 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.733076096s of 19.895635605s, submitted: 44
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96a9000/0x0/0x4ffc00000, data 0x1af8065/0x1bb3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138870784 unmapped: 30572544 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 30326784 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 30326784 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1481696 data_alloc: 234881024 data_used: 25444352
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 30326784 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8fe6000/0x0/0x4ffc00000, data 0x21bb065/0x2276000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139132928 unmapped: 30310400 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8fe6000/0x0/0x4ffc00000, data 0x21bb065/0x2276000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139132928 unmapped: 30310400 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139141120 unmapped: 30302208 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139157504 unmapped: 30285824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1480672 data_alloc: 234881024 data_used: 25448448
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8fe4000/0x0/0x4ffc00000, data 0x21bd065/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139157504 unmapped: 30285824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139157504 unmapped: 30285824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139157504 unmapped: 30285824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139157504 unmapped: 30285824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.296658516s of 12.521332741s, submitted: 87
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00800 session 0x557d7c08d4a0
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00000 session 0x557d7c554f00
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130342912 unmapped: 39100416 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1235822 data_alloc: 234881024 data_used: 10539008
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c442960
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130400256 unmapped: 39043072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130400256 unmapped: 39043072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130400256 unmapped: 39043072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130400256 unmapped: 39043072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130400256 unmapped: 39043072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130400256 unmapped: 39043072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130400256 unmapped: 39043072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130400256 unmapped: 39043072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130408448 unmapped: 39034880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130408448 unmapped: 39034880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130408448 unmapped: 39034880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130408448 unmapped: 39034880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130408448 unmapped: 39034880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130408448 unmapped: 39034880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130408448 unmapped: 39034880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130416640 unmapped: 39026688 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130416640 unmapped: 39026688 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130416640 unmapped: 39026688 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130416640 unmapped: 39026688 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130416640 unmapped: 39026688 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130416640 unmapped: 39026688 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130416640 unmapped: 39026688 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130416640 unmapped: 39026688 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130424832 unmapped: 39018496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130424832 unmapped: 39018496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130424832 unmapped: 39018496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130424832 unmapped: 39018496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130424832 unmapped: 39018496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130424832 unmapped: 39018496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130424832 unmapped: 39018496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130424832 unmapped: 39018496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130433024 unmapped: 39010304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130433024 unmapped: 39010304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130433024 unmapped: 39010304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130433024 unmapped: 39010304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130433024 unmapped: 39010304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130433024 unmapped: 39010304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130433024 unmapped: 39010304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130433024 unmapped: 39010304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130441216 unmapped: 39002112 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130441216 unmapped: 39002112 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130449408 unmapped: 38993920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130449408 unmapped: 38993920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130449408 unmapped: 38993920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130449408 unmapped: 38993920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130449408 unmapped: 38993920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130449408 unmapped: 38993920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130457600 unmapped: 38985728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130457600 unmapped: 38985728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130457600 unmapped: 38985728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130457600 unmapped: 38985728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130465792 unmapped: 38977536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130465792 unmapped: 38977536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130465792 unmapped: 38977536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130465792 unmapped: 38977536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130465792 unmapped: 38977536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130465792 unmapped: 38977536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130473984 unmapped: 38969344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130473984 unmapped: 38969344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130473984 unmapped: 38969344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130473984 unmapped: 38969344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130473984 unmapped: 38969344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130473984 unmapped: 38969344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130482176 unmapped: 38961152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130482176 unmapped: 38961152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130482176 unmapped: 38961152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130482176 unmapped: 38961152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130482176 unmapped: 38961152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130490368 unmapped: 38952960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130490368 unmapped: 38952960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130490368 unmapped: 38952960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130490368 unmapped: 38952960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130490368 unmapped: 38952960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130490368 unmapped: 38952960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: do_command 'config diff' '{prefix=config diff}'
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130531328 unmapped: 38912000 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: do_command 'config show' '{prefix=config show}'
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: do_command 'counter dump' '{prefix=counter dump}'
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: do_command 'counter schema' '{prefix=counter schema}'
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130572288 unmapped: 38871040 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130572288 unmapped: 38871040 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:18:46 np0005532763 ceph-osd[78269]: do_command 'log dump' '{prefix=log dump}'
Nov 23 16:18:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:18:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:18:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:18:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:18:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:47 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 23 16:18:47 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3778759182' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 23 16:18:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:47 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Nov 23 16:18:47 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2992208968' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 23 16:18:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:48.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:48 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Nov 23 16:18:48 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3147972838' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Nov 23 16:18:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:48 np0005532763 nova_compute[231311]: 2025-11-23 21:18:48.629 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:48.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Nov 23 16:18:49 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2988905069' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Nov 23 16:18:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Nov 23 16:18:49 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1452038472' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Nov 23 16:18:49 np0005532763 nova_compute[231311]: 2025-11-23 21:18:49.795 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Nov 23 16:18:49 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3953901098' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Nov 23 16:18:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:18:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:50.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:50 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Nov 23 16:18:50 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1699447290' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Nov 23 16:18:50 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Nov 23 16:18:50 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4104383779' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Nov 23 16:18:50 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Nov 23 16:18:50 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2946597723' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Nov 23 16:18:50 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Nov 23 16:18:50 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4095315572' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Nov 23 16:18:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:50.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:51 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Nov 23 16:18:51 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3275635344' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Nov 23 16:18:51 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Nov 23 16:18:51 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3131615339' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Nov 23 16:18:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:51 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Nov 23 16:18:51 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2339120237' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Nov 23 16:18:51 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Nov 23 16:18:51 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3724249338' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Nov 23 16:18:51 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Nov 23 16:18:51 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1438252920' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Nov 23 16:18:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:18:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:18:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:18:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:18:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:52.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:52 np0005532763 systemd[1]: Starting Hostname Service...
Nov 23 16:18:52 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Nov 23 16:18:52 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/953132378' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Nov 23 16:18:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:18:52.234 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:18:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:18:52.234 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:18:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:18:52.234 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:18:52 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Nov 23 16:18:52 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/400688937' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Nov 23 16:18:52 np0005532763 systemd[1]: Started Hostname Service.
Nov 23 16:18:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:52 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Nov 23 16:18:52 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3628178677' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Nov 23 16:18:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:52.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:53 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Nov 23 16:18:53 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/420032919' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Nov 23 16:18:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:53 np0005532763 nova_compute[231311]: 2025-11-23 21:18:53.630 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:53 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Nov 23 16:18:53 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1025556095' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Nov 23 16:18:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:54.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Nov 23 16:18:54 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/736443909' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Nov 23 16:18:54 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 16:18:54 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 16:18:54 np0005532763 nova_compute[231311]: 2025-11-23 21:18:54.798 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:54.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Nov 23 16:18:54 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2788619157' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Nov 23 16:18:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:18:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:55 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 16:18:55 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 16:18:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:55 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Nov 23 16:18:55 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1917580326' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Nov 23 16:18:55 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 16:18:55 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 16:18:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:56.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:56 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Nov 23 16:18:56 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3968254591' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Nov 23 16:18:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:18:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:56.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:18:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:18:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:18:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:18:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:18:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:18:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:57 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Nov 23 16:18:57 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/717847632' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Nov 23 16:18:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:57 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Nov 23 16:18:57 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1486092316' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Nov 23 16:18:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:18:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:18:58.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:18:58 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Nov 23 16:18:58 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1440736949' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Nov 23 16:18:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:58 np0005532763 podman[248460]: 2025-11-23 21:18:58.620227938 +0000 UTC m=+0.108906107 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 23 16:18:58 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Nov 23 16:18:58 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2565151772' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Nov 23 16:18:58 np0005532763 nova_compute[231311]: 2025-11-23 21:18:58.691 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:18:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:18:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:18:58.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:18:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:18:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:18:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:18:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Nov 23 16:18:59 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/268968875' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Nov 23 16:18:59 np0005532763 nova_compute[231311]: 2025-11-23 21:18:59.799 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:18:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:18:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Nov 23 16:18:59 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1664248280' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Nov 23 16:19:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:19:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:00.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:19:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:00 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Nov 23 16:19:00 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1102969640' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Nov 23 16:19:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:00.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:01 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0)
Nov 23 16:19:01 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2929929060' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Nov 23 16:19:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:19:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:19:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:19:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:19:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:19:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:02.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:19:02 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Nov 23 16:19:02 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3128593780' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Nov 23 16:19:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:19:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:02.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:19:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:03 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0)
Nov 23 16:19:03 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2236210916' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Nov 23 16:19:03 np0005532763 ovs-appctl[249541]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 23 16:19:03 np0005532763 ovs-appctl[249555]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 23 16:19:03 np0005532763 nova_compute[231311]: 2025-11-23 21:19:03.692 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:03 np0005532763 ovs-appctl[249562]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 23 16:19:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0)
Nov 23 16:19:04 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2505412994' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Nov 23 16:19:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:04.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:04 np0005532763 nova_compute[231311]: 2025-11-23 21:19:04.802 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:04.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:19:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:05 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Nov 23 16:19:05 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2492804787' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Nov 23 16:19:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:05 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Nov 23 16:19:05 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3353732940' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Nov 23 16:19:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:06.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:06 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Nov 23 16:19:06 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/836800842' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Nov 23 16:19:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:19:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:06.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:19:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:19:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:19:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:19:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:19:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Nov 23 16:19:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1190473260' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Nov 23 16:19:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Nov 23 16:19:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1759951106' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Nov 23 16:19:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:19:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2717766197' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:19:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:19:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2717766197' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:19:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:08 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Nov 23 16:19:08 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3235159436' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Nov 23 16:19:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:08.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:08 np0005532763 nova_compute[231311]: 2025-11-23 21:19:08.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:19:08 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Nov 23 16:19:08 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1956868693' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Nov 23 16:19:08 np0005532763 nova_compute[231311]: 2025-11-23 21:19:08.694 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:19:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:08.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:19:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:09 np0005532763 podman[251160]: 2025-11-23 21:19:09.211332882 +0000 UTC m=+0.086191743 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 16:19:09 np0005532763 podman[251161]: 2025-11-23 21:19:09.239873547 +0000 UTC m=+0.113960457 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 16:19:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Nov 23 16:19:09 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1036166281' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Nov 23 16:19:09 np0005532763 nova_compute[231311]: 2025-11-23 21:19:09.805 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Nov 23 16:19:09 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/556149434' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Nov 23 16:19:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:19:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:19:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:10.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:19:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:10 np0005532763 nova_compute[231311]: 2025-11-23 21:19:10.378 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:19:10 np0005532763 nova_compute[231311]: 2025-11-23 21:19:10.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:19:10 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Nov 23 16:19:10 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2044575977' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Nov 23 16:19:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:10.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:11 np0005532763 nova_compute[231311]: 2025-11-23 21:19:11.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:19:11 np0005532763 nova_compute[231311]: 2025-11-23 21:19:11.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:19:11 np0005532763 nova_compute[231311]: 2025-11-23 21:19:11.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:19:11 np0005532763 nova_compute[231311]: 2025-11-23 21:19:11.396 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:19:11 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Nov 23 16:19:11 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2763860580' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Nov 23 16:19:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:19:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:19:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:19:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:19:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:19:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:12.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:19:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:12 np0005532763 nova_compute[231311]: 2025-11-23 21:19:12.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:19:12 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0)
Nov 23 16:19:12 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2698622030' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Nov 23 16:19:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:12.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:13 np0005532763 nova_compute[231311]: 2025-11-23 21:19:13.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:19:13 np0005532763 nova_compute[231311]: 2025-11-23 21:19:13.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:19:13 np0005532763 nova_compute[231311]: 2025-11-23 21:19:13.382 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:19:13 np0005532763 nova_compute[231311]: 2025-11-23 21:19:13.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:19:13 np0005532763 nova_compute[231311]: 2025-11-23 21:19:13.410 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:19:13 np0005532763 nova_compute[231311]: 2025-11-23 21:19:13.410 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:19:13 np0005532763 nova_compute[231311]: 2025-11-23 21:19:13.410 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:19:13 np0005532763 nova_compute[231311]: 2025-11-23 21:19:13.410 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:19:13 np0005532763 nova_compute[231311]: 2025-11-23 21:19:13.411 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:19:13 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Nov 23 16:19:13 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3600430993' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Nov 23 16:19:13 np0005532763 nova_compute[231311]: 2025-11-23 21:19:13.698 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:13 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:19:13 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3257865756' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:19:13 np0005532763 nova_compute[231311]: 2025-11-23 21:19:13.925 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:19:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:14.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:14 np0005532763 nova_compute[231311]: 2025-11-23 21:19:14.121 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:19:14 np0005532763 nova_compute[231311]: 2025-11-23 21:19:14.122 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4533MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:19:14 np0005532763 nova_compute[231311]: 2025-11-23 21:19:14.122 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:19:14 np0005532763 nova_compute[231311]: 2025-11-23 21:19:14.123 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:19:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Nov 23 16:19:14 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3652618139' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Nov 23 16:19:14 np0005532763 nova_compute[231311]: 2025-11-23 21:19:14.179 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:19:14 np0005532763 nova_compute[231311]: 2025-11-23 21:19:14.182 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:19:14 np0005532763 nova_compute[231311]: 2025-11-23 21:19:14.200 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:19:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:19:14 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2289863147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:19:14 np0005532763 nova_compute[231311]: 2025-11-23 21:19:14.745 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.545s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:19:14 np0005532763 nova_compute[231311]: 2025-11-23 21:19:14.752 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:19:14 np0005532763 nova_compute[231311]: 2025-11-23 21:19:14.770 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:19:14 np0005532763 nova_compute[231311]: 2025-11-23 21:19:14.773 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:19:14 np0005532763 nova_compute[231311]: 2025-11-23 21:19:14.773 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.650s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:19:14 np0005532763 nova_compute[231311]: 2025-11-23 21:19:14.850 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:14.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:19:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Nov 23 16:19:15 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3398821895' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Nov 23 16:19:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Nov 23 16:19:15 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4268883983' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Nov 23 16:19:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:16.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:16 np0005532763 virtqemud[230850]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 23 16:19:16 np0005532763 nova_compute[231311]: 2025-11-23 21:19:16.778 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:19:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:16.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:19:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:19:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:19:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:19:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:19:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:18.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:19:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:18 np0005532763 nova_compute[231311]: 2025-11-23 21:19:18.701 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:18 np0005532763 systemd[1]: Starting Time & Date Service...
Nov 23 16:19:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:18.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:18 np0005532763 systemd[1]: Started Time & Date Service.
Nov 23 16:19:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:19 np0005532763 nova_compute[231311]: 2025-11-23 21:19:19.906 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:19:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:19:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:20.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:19:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:19:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:20.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:19:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:19:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:19:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:19:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:19:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:22.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:22.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:23 np0005532763 nova_compute[231311]: 2025-11-23 21:19:23.747 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:19:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:24.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:19:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:19:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:24.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:19:24 np0005532763 nova_compute[231311]: 2025-11-23 21:19:24.909 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:19:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:19:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:26.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:19:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:26.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:19:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:19:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:19:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:19:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:28.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:28 np0005532763 nova_compute[231311]: 2025-11-23 21:19:28.783 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:28.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:29 np0005532763 podman[252183]: 2025-11-23 21:19:29.242101926 +0000 UTC m=+0.105106507 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 16:19:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:29 np0005532763 nova_compute[231311]: 2025-11-23 21:19:29.938 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:19:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:19:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:30.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:19:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:30.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:19:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:19:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:19:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:19:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:32.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.003000084s ======
Nov 23 16:19:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:32.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000084s
Nov 23 16:19:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:33 np0005532763 nova_compute[231311]: 2025-11-23 21:19:33.786 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:19:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:34.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:19:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:19:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:34.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:19:34 np0005532763 nova_compute[231311]: 2025-11-23 21:19:34.940 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:19:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:19:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:36.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:19:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:36.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:19:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:19:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:19:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:19:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:38.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:38 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:19:38 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:19:38 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:19:38 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:19:38 np0005532763 nova_compute[231311]: 2025-11-23 21:19:38.792 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:19:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:38.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:19:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:39 np0005532763 nova_compute[231311]: 2025-11-23 21:19:39.944 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:19:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:40.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:40 np0005532763 podman[252294]: 2025-11-23 21:19:40.210089846 +0000 UTC m=+0.083615600 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 23 16:19:40 np0005532763 podman[252295]: 2025-11-23 21:19:40.255130037 +0000 UTC m=+0.126393837 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 16:19:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:40.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:19:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:19:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:19:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:19:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:19:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:42.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:19:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:42.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:43 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:19:43 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:19:43 np0005532763 nova_compute[231311]: 2025-11-23 21:19:43.793 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:19:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:44.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:19:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:44.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:44 np0005532763 nova_compute[231311]: 2025-11-23 21:19:44.945 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:19:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:46.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:46.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:19:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:19:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:19:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:19:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:48.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:48 np0005532763 nova_compute[231311]: 2025-11-23 21:19:48.797 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:48 np0005532763 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 23 16:19:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:48.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:48 np0005532763 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 23 16:19:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:49 np0005532763 nova_compute[231311]: 2025-11-23 21:19:49.948 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:19:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:50.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:50.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:19:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:19:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:19:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:19:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:52.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:19:52.235 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:19:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:19:52.236 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:19:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:19:52.236 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:19:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:52.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:53 np0005532763 nova_compute[231311]: 2025-11-23 21:19:53.800 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:54.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:19:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:54.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:19:54 np0005532763 nova_compute[231311]: 2025-11-23 21:19:54.951 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:19:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:56.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:56.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:19:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:19:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:19:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:19:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:19:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:19:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:19:58.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:19:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:58 np0005532763 nova_compute[231311]: 2025-11-23 21:19:58.838 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:19:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:19:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:19:58.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:19:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:19:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:19:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:19:59 np0005532763 podman[252411]: 2025-11-23 21:19:59.447718999 +0000 UTC m=+0.092713917 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:19:59 np0005532763 nova_compute[231311]: 2025-11-23 21:19:59.960 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:19:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:20:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:00.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:00 np0005532763 ceph-mon[75752]: Health detail: HEALTH_WARN 1 failed cephadm daemon(s)
Nov 23 16:20:00 np0005532763 ceph-mon[75752]: [WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s)
Nov 23 16:20:00 np0005532763 ceph-mon[75752]:    daemon nfs.cephfs.0.0.compute-1.fuxuha on compute-1 is in error state
Nov 23 16:20:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:00.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:20:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:20:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:20:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:20:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:02.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:02.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:03 np0005532763 nova_compute[231311]: 2025-11-23 21:20:03.841 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:04.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:04.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:04 np0005532763 nova_compute[231311]: 2025-11-23 21:20:04.962 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:20:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:05 np0005532763 systemd[1]: session-55.scope: Deactivated successfully.
Nov 23 16:20:05 np0005532763 systemd-logind[830]: Session 55 logged out. Waiting for processes to exit.
Nov 23 16:20:05 np0005532763 systemd[1]: session-55.scope: Consumed 3min 449ms CPU time, 741.1M memory peak, read 278.1M from disk, written 225.5M to disk.
Nov 23 16:20:05 np0005532763 systemd-logind[830]: Removed session 55.
Nov 23 16:20:05 np0005532763 systemd-logind[830]: New session 56 of user zuul.
Nov 23 16:20:06 np0005532763 systemd[1]: Started Session 56 of User zuul.
Nov 23 16:20:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:06.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:06 np0005532763 systemd[1]: session-56.scope: Deactivated successfully.
Nov 23 16:20:06 np0005532763 systemd-logind[830]: Session 56 logged out. Waiting for processes to exit.
Nov 23 16:20:06 np0005532763 systemd-logind[830]: Removed session 56.
Nov 23 16:20:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:06 np0005532763 systemd-logind[830]: New session 57 of user zuul.
Nov 23 16:20:06 np0005532763 systemd[1]: Started Session 57 of User zuul.
Nov 23 16:20:06 np0005532763 systemd[1]: session-57.scope: Deactivated successfully.
Nov 23 16:20:06 np0005532763 systemd-logind[830]: Session 57 logged out. Waiting for processes to exit.
Nov 23 16:20:06 np0005532763 systemd-logind[830]: Removed session 57.
Nov 23 16:20:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:20:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:06.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:20:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:20:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:20:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:20:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:20:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:20:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2272389192' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:20:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:20:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2272389192' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:20:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:08.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:08 np0005532763 nova_compute[231311]: 2025-11-23 21:20:08.893 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:08.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:09 np0005532763 nova_compute[231311]: 2025-11-23 21:20:09.380 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:20:09 np0005532763 nova_compute[231311]: 2025-11-23 21:20:09.396 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:20:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:20:09 np0005532763 nova_compute[231311]: 2025-11-23 21:20:09.995 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:10.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:10 np0005532763 nova_compute[231311]: 2025-11-23 21:20:10.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:20:10 np0005532763 nova_compute[231311]: 2025-11-23 21:20:10.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:20:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:10.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:11 np0005532763 podman[252525]: 2025-11-23 21:20:11.239627307 +0000 UTC m=+0.103986975 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 23 16:20:11 np0005532763 podman[252526]: 2025-11-23 21:20:11.286640534 +0000 UTC m=+0.151061854 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 16:20:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:20:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:20:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:20:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:20:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:12.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:12 np0005532763 nova_compute[231311]: 2025-11-23 21:20:12.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:20:12 np0005532763 nova_compute[231311]: 2025-11-23 21:20:12.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:20:12 np0005532763 nova_compute[231311]: 2025-11-23 21:20:12.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:20:12 np0005532763 nova_compute[231311]: 2025-11-23 21:20:12.399 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:20:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:12.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:13 np0005532763 nova_compute[231311]: 2025-11-23 21:20:13.896 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:14.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:14 np0005532763 nova_compute[231311]: 2025-11-23 21:20:14.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:20:14 np0005532763 nova_compute[231311]: 2025-11-23 21:20:14.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:20:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:14.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:20:14 np0005532763 nova_compute[231311]: 2025-11-23 21:20:14.997 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:15 np0005532763 nova_compute[231311]: 2025-11-23 21:20:15.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:20:15 np0005532763 nova_compute[231311]: 2025-11-23 21:20:15.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:20:15 np0005532763 nova_compute[231311]: 2025-11-23 21:20:15.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:20:15 np0005532763 nova_compute[231311]: 2025-11-23 21:20:15.417 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:20:15 np0005532763 nova_compute[231311]: 2025-11-23 21:20:15.418 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:20:15 np0005532763 nova_compute[231311]: 2025-11-23 21:20:15.418 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:20:15 np0005532763 nova_compute[231311]: 2025-11-23 21:20:15.418 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:20:15 np0005532763 nova_compute[231311]: 2025-11-23 21:20:15.419 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:20:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:20:15 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3456409841' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:20:15 np0005532763 nova_compute[231311]: 2025-11-23 21:20:15.904 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:20:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:16 np0005532763 nova_compute[231311]: 2025-11-23 21:20:16.148 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:20:16 np0005532763 nova_compute[231311]: 2025-11-23 21:20:16.150 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4789MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:20:16 np0005532763 nova_compute[231311]: 2025-11-23 21:20:16.151 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:20:16 np0005532763 nova_compute[231311]: 2025-11-23 21:20:16.151 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:20:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:16.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:16 np0005532763 nova_compute[231311]: 2025-11-23 21:20:16.251 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:20:16 np0005532763 nova_compute[231311]: 2025-11-23 21:20:16.251 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:20:16 np0005532763 nova_compute[231311]: 2025-11-23 21:20:16.273 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:20:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:16 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:20:16 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/761156875' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:20:16 np0005532763 nova_compute[231311]: 2025-11-23 21:20:16.800 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:20:16 np0005532763 nova_compute[231311]: 2025-11-23 21:20:16.809 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:20:16 np0005532763 nova_compute[231311]: 2025-11-23 21:20:16.822 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:20:16 np0005532763 nova_compute[231311]: 2025-11-23 21:20:16.825 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:20:16 np0005532763 nova_compute[231311]: 2025-11-23 21:20:16.825 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:20:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:16.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:20:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:20:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:20:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:20:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:17 np0005532763 nova_compute[231311]: 2025-11-23 21:20:17.825 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:20:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:18.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:18 np0005532763 nova_compute[231311]: 2025-11-23 21:20:18.943 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:18.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:20:20 np0005532763 nova_compute[231311]: 2025-11-23 21:20:20.030 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:20.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:20.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:20:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:20:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:20:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:20:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:22.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:22.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:23 np0005532763 nova_compute[231311]: 2025-11-23 21:20:23.988 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:24.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:24.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:20:25 np0005532763 nova_compute[231311]: 2025-11-23 21:20:25.059 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:26.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:26.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:20:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:20:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:20:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:20:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:20:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:28.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:20:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:28.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:29 np0005532763 nova_compute[231311]: 2025-11-23 21:20:29.023 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:29 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:20:30 np0005532763 nova_compute[231311]: 2025-11-23 21:20:30.061 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:30 np0005532763 podman[252659]: 2025-11-23 21:20:30.204524533 +0000 UTC m=+0.084509015 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 16:20:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:30.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:30.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:20:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:20:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:20:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:20:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:20:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:32.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:20:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:20:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:32.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:20:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:34 np0005532763 nova_compute[231311]: 2025-11-23 21:20:34.025 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:34.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:34 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:20:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:20:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:34.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:20:35 np0005532763 nova_compute[231311]: 2025-11-23 21:20:35.063 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:36.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:36.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:20:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:20:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:20:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:20:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:38.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:20:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:39.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:20:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:39 np0005532763 nova_compute[231311]: 2025-11-23 21:20:39.068 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:39 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:20:40 np0005532763 nova_compute[231311]: 2025-11-23 21:20:40.065 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:40.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:41.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:20:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:20:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:20:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:20:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:42 np0005532763 podman[252690]: 2025-11-23 21:20:42.2218438 +0000 UTC m=+0.092170321 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 23 16:20:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:42.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:42 np0005532763 podman[252691]: 2025-11-23 21:20:42.268526727 +0000 UTC m=+0.133516197 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller)
Nov 23 16:20:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:43.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:44 np0005532763 nova_compute[231311]: 2025-11-23 21:20:44.101 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:20:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:44.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:20:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:44 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:20:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:45.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:45 np0005532763 nova_compute[231311]: 2025-11-23 21:20:45.067 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:20:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:20:45.752173) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932845752296, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2642, "num_deletes": 508, "total_data_size": 5370871, "memory_usage": 5451328, "flush_reason": "Manual Compaction"}
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932845774461, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 3492192, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33831, "largest_seqno": 36468, "table_properties": {"data_size": 3481003, "index_size": 6403, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3717, "raw_key_size": 30131, "raw_average_key_size": 20, "raw_value_size": 3455012, "raw_average_value_size": 2350, "num_data_blocks": 274, "num_entries": 1470, "num_filter_entries": 1470, "num_deletions": 508, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763932684, "oldest_key_time": 1763932684, "file_creation_time": 1763932845, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 22374 microseconds, and 13385 cpu microseconds.
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:20:45.774549) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 3492192 bytes OK
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:20:45.774588) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:20:45.776881) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:20:45.776942) EVENT_LOG_v1 {"time_micros": 1763932845776925, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:20:45.776973) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 5357341, prev total WAL file size 5357341, number of live WAL files 2.
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:20:45.779665) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323530' seq:72057594037927935, type:22 .. '6B7600353033' seq:0, type:0; will stop at (end)
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(3410KB)], [63(13MB)]
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932845779736, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 18098201, "oldest_snapshot_seqno": -1}
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6670 keys, 16619813 bytes, temperature: kUnknown
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932845863937, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 16619813, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16573382, "index_size": 28655, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16709, "raw_key_size": 173990, "raw_average_key_size": 26, "raw_value_size": 16451558, "raw_average_value_size": 2466, "num_data_blocks": 1141, "num_entries": 6670, "num_filter_entries": 6670, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763932845, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:20:45.864369) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 16619813 bytes
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:20:45.866354) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 214.7 rd, 197.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.3, 13.9 +0.0 blob) out(15.8 +0.0 blob), read-write-amplify(9.9) write-amplify(4.8) OK, records in: 7703, records dropped: 1033 output_compression: NoCompression
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:20:45.866387) EVENT_LOG_v1 {"time_micros": 1763932845866371, "job": 38, "event": "compaction_finished", "compaction_time_micros": 84300, "compaction_time_cpu_micros": 56208, "output_level": 6, "num_output_files": 1, "total_output_size": 16619813, "num_input_records": 7703, "num_output_records": 6670, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932845867550, "job": 38, "event": "table_file_deletion", "file_number": 65}
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932845873023, "job": 38, "event": "table_file_deletion", "file_number": 63}
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:20:45.779534) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:20:45.873163) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:20:45.873173) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:20:45.873177) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:20:45.873181) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:20:45 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:20:45.873185) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:20:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:46.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:20:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:20:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:20:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:20:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:47.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:48.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:49.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:49 np0005532763 nova_compute[231311]: 2025-11-23 21:20:49.103 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:49 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:20:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:50 np0005532763 nova_compute[231311]: 2025-11-23 21:20:50.069 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:50.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:50 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:20:50 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:20:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:51.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:20:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:20:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:20:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:20:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:20:52.236 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:20:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:20:52.237 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:20:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:20:52.237 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:20:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:52.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:53.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:54 np0005532763 nova_compute[231311]: 2025-11-23 21:20:54.121 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:54.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:54 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:20:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:55.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:55 np0005532763 nova_compute[231311]: 2025-11-23 21:20:55.071 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:20:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:56.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:20:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:20:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:20:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:20:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:20:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:20:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:57.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:20:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:20:58.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:20:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:20:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:20:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:20:59.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:20:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:20:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:59 np0005532763 nova_compute[231311]: 2025-11-23 21:20:59.124 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:20:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:20:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:20:59 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:21:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:00 np0005532763 nova_compute[231311]: 2025-11-23 21:21:00.072 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:00.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:01.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:01 np0005532763 podman[252887]: 2025-11-23 21:21:01.213479415 +0000 UTC m=+0.088724335 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 16:21:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:21:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:21:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:21:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:21:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:02.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:03.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:04 np0005532763 nova_compute[231311]: 2025-11-23 21:21:04.126 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:21:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:04.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:21:04 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 16:21:04 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 7082 writes, 36K keys, 7082 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s#012Cumulative WAL: 7082 writes, 7082 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1610 writes, 8418 keys, 1610 commit groups, 1.0 writes per commit group, ingest: 17.95 MB, 0.03 MB/s#012Interval WAL: 1610 writes, 1610 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0    150.6      0.37              0.22        19    0.019       0      0       0.0       0.0#012  L6      1/0   15.85 MB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   4.4    186.0    159.9      1.52              0.86        18    0.084    101K    10K       0.0       0.0#012 Sum      1/0   15.85 MB   0.0      0.3     0.1      0.2       0.3      0.1       0.0   5.4    149.7    158.1      1.88              1.08        37    0.051    101K    10K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.9    167.6    174.3      0.50              0.34        10    0.050     34K   3616       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   0.0    186.0    159.9      1.52              0.86        18    0.084    101K    10K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0    151.6      0.37              0.22        18    0.020       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 2400.0 total, 600.0 interval#012Flush(GB): cumulative 0.054, interval 0.015#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.29 GB write, 0.12 MB/s write, 0.28 GB read, 0.12 MB/s read, 1.9 seconds#012Interval compaction: 0.09 GB write, 0.15 MB/s write, 0.08 GB read, 0.14 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e7d0d09350#2 capacity: 304.00 MB usage: 24.06 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000212 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(1472,23.26 MB,7.65114%) FilterBlock(37,298.73 KB,0.0959647%) IndexBlock(37,520.92 KB,0.16734%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Nov 23 16:21:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:21:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:21:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:05.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:21:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:05 np0005532763 nova_compute[231311]: 2025-11-23 21:21:05.101 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:21:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:06.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:21:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:21:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:21:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:21:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:21:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:21:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:07.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:21:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:21:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:08.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:21:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:09.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:09 np0005532763 nova_compute[231311]: 2025-11-23 21:21:09.161 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:21:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:10 np0005532763 nova_compute[231311]: 2025-11-23 21:21:10.103 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:21:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:10.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:21:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:10 np0005532763 nova_compute[231311]: 2025-11-23 21:21:10.379 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:21:10 np0005532763 nova_compute[231311]: 2025-11-23 21:21:10.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:21:10 np0005532763 nova_compute[231311]: 2025-11-23 21:21:10.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:21:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:11.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:21:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:21:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:21:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:21:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:21:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:12.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:21:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:21:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:13.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:21:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:13 np0005532763 podman[252947]: 2025-11-23 21:21:13.229730952 +0000 UTC m=+0.101549906 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:21:13 np0005532763 podman[252948]: 2025-11-23 21:21:13.264097142 +0000 UTC m=+0.136711759 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 23 16:21:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:13 np0005532763 nova_compute[231311]: 2025-11-23 21:21:13.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:21:13 np0005532763 nova_compute[231311]: 2025-11-23 21:21:13.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:21:13 np0005532763 nova_compute[231311]: 2025-11-23 21:21:13.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:21:13 np0005532763 nova_compute[231311]: 2025-11-23 21:21:13.402 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:21:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:14 np0005532763 nova_compute[231311]: 2025-11-23 21:21:14.163 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:14.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:21:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:15.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:15 np0005532763 nova_compute[231311]: 2025-11-23 21:21:15.143 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:15 np0005532763 nova_compute[231311]: 2025-11-23 21:21:15.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:21:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:16.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:16 np0005532763 nova_compute[231311]: 2025-11-23 21:21:16.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:21:16 np0005532763 nova_compute[231311]: 2025-11-23 21:21:16.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:21:16 np0005532763 nova_compute[231311]: 2025-11-23 21:21:16.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:21:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:21:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:21:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:21:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:21:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:17.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:17 np0005532763 nova_compute[231311]: 2025-11-23 21:21:17.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:21:17 np0005532763 nova_compute[231311]: 2025-11-23 21:21:17.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:21:17 np0005532763 nova_compute[231311]: 2025-11-23 21:21:17.414 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:21:17 np0005532763 nova_compute[231311]: 2025-11-23 21:21:17.415 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:21:17 np0005532763 nova_compute[231311]: 2025-11-23 21:21:17.415 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:21:17 np0005532763 nova_compute[231311]: 2025-11-23 21:21:17.415 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:21:17 np0005532763 nova_compute[231311]: 2025-11-23 21:21:17.416 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:21:17 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:21:17 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1849403323' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:21:17 np0005532763 nova_compute[231311]: 2025-11-23 21:21:17.915 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:21:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:18 np0005532763 nova_compute[231311]: 2025-11-23 21:21:18.177 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:21:18 np0005532763 nova_compute[231311]: 2025-11-23 21:21:18.178 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4828MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:21:18 np0005532763 nova_compute[231311]: 2025-11-23 21:21:18.179 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:21:18 np0005532763 nova_compute[231311]: 2025-11-23 21:21:18.179 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:21:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:21:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:18.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:21:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:18 np0005532763 nova_compute[231311]: 2025-11-23 21:21:18.397 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:21:18 np0005532763 nova_compute[231311]: 2025-11-23 21:21:18.397 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:21:18 np0005532763 nova_compute[231311]: 2025-11-23 21:21:18.424 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:21:18 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:21:18 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3325004431' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:21:18 np0005532763 nova_compute[231311]: 2025-11-23 21:21:18.914 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:21:18 np0005532763 nova_compute[231311]: 2025-11-23 21:21:18.921 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:21:18 np0005532763 nova_compute[231311]: 2025-11-23 21:21:18.952 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:21:18 np0005532763 nova_compute[231311]: 2025-11-23 21:21:18.954 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:21:18 np0005532763 nova_compute[231311]: 2025-11-23 21:21:18.955 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.776s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:21:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:19.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:19 np0005532763 nova_compute[231311]: 2025-11-23 21:21:19.166 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:21:19.861693) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932879861774, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 582, "num_deletes": 251, "total_data_size": 1037271, "memory_usage": 1048008, "flush_reason": "Manual Compaction"}
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932879869868, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 682582, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36473, "largest_seqno": 37050, "table_properties": {"data_size": 679529, "index_size": 1025, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7144, "raw_average_key_size": 19, "raw_value_size": 673450, "raw_average_value_size": 1815, "num_data_blocks": 44, "num_entries": 371, "num_filter_entries": 371, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763932846, "oldest_key_time": 1763932846, "file_creation_time": 1763932879, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 9223 microseconds, and 5623 cpu microseconds.
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:21:19.870041) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 682582 bytes OK
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:21:19.871208) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:21:19.873114) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:21:19.873169) EVENT_LOG_v1 {"time_micros": 1763932879873153, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:21:19.873201) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 1033966, prev total WAL file size 1033966, number of live WAL files 2.
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:21:19.874692) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(666KB)], [66(15MB)]
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932879874750, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 17302395, "oldest_snapshot_seqno": -1}
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6527 keys, 15181308 bytes, temperature: kUnknown
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932879953753, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 15181308, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15137001, "index_size": 26917, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16325, "raw_key_size": 171723, "raw_average_key_size": 26, "raw_value_size": 15018688, "raw_average_value_size": 2301, "num_data_blocks": 1063, "num_entries": 6527, "num_filter_entries": 6527, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763932879, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:21:19.954132) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 15181308 bytes
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:21:19.955730) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 218.6 rd, 191.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 15.8 +0.0 blob) out(14.5 +0.0 blob), read-write-amplify(47.6) write-amplify(22.2) OK, records in: 7041, records dropped: 514 output_compression: NoCompression
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:21:19.955762) EVENT_LOG_v1 {"time_micros": 1763932879955748, "job": 40, "event": "compaction_finished", "compaction_time_micros": 79145, "compaction_time_cpu_micros": 50290, "output_level": 6, "num_output_files": 1, "total_output_size": 15181308, "num_input_records": 7041, "num_output_records": 6527, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932879956446, "job": 40, "event": "table_file_deletion", "file_number": 68}
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932879961879, "job": 40, "event": "table_file_deletion", "file_number": 66}
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:21:19.874553) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:21:19.961975) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:21:19.961981) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:21:19.961984) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:21:19.961987) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:21:19 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:21:19.961990) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:21:20 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:21:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:20 np0005532763 nova_compute[231311]: 2025-11-23 21:21:20.176 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:20.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:21.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:21:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:21:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:21:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:21:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:22.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:23.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:24 np0005532763 nova_compute[231311]: 2025-11-23 21:21:24.167 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:21:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:24.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:21:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:25 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:21:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:25.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:25 np0005532763 nova_compute[231311]: 2025-11-23 21:21:25.221 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:21:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:26.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:21:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:21:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:21:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:21:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:21:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:21:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:27.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:21:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:28.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:29.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:29 np0005532763 nova_compute[231311]: 2025-11-23 21:21:29.170 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:30 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:21:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:30 np0005532763 nova_compute[231311]: 2025-11-23 21:21:30.255 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:30.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:31.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:21:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:21:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:21:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:21:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:32 np0005532763 podman[253082]: 2025-11-23 21:21:32.211716903 +0000 UTC m=+0.089744033 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 23 16:21:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:21:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:32.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:21:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:33.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:34 np0005532763 nova_compute[231311]: 2025-11-23 21:21:34.214 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:34.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:35 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:21:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:35.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:35 np0005532763 nova_compute[231311]: 2025-11-23 21:21:35.288 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:36.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:21:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:21:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:21:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:21:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:37.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:21:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:38.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:21:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:39.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:39 np0005532763 nova_compute[231311]: 2025-11-23 21:21:39.216 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 16:21:39 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 10K writes, 42K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s#012Cumulative WAL: 10K writes, 3067 syncs, 3.50 writes per sync, written: 0.04 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2228 writes, 8044 keys, 2228 commit groups, 1.0 writes per commit group, ingest: 8.01 MB, 0.01 MB/s#012Interval WAL: 2228 writes, 922 syncs, 2.42 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 16:21:40 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:21:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:40.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:40 np0005532763 nova_compute[231311]: 2025-11-23 21:21:40.341 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:41.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:21:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:21:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:21:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:21:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:42.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:21:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:43.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:21:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:44 np0005532763 podman[253113]: 2025-11-23 21:21:44.214631283 +0000 UTC m=+0.095154486 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd)
Nov 23 16:21:44 np0005532763 nova_compute[231311]: 2025-11-23 21:21:44.220 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:44 np0005532763 podman[253114]: 2025-11-23 21:21:44.286598143 +0000 UTC m=+0.162387953 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Nov 23 16:21:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:44.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:45 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:21:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:21:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:45.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:21:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:45 np0005532763 nova_compute[231311]: 2025-11-23 21:21:45.343 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:46.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:21:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:21:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:21:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:21:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:47.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:48.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:21:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:49.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:21:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:49 np0005532763 nova_compute[231311]: 2025-11-23 21:21:49.221 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:50 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:21:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:21:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:50.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:21:50 np0005532763 nova_compute[231311]: 2025-11-23 21:21:50.382 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:51.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:51 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:21:51 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:21:51 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:21:51 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:21:51 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:21:51 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:21:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:21:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:21:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:21:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:21:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:21:52.237 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:21:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:21:52.238 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:21:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:21:52.238 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:21:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:52.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:21:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:53.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:21:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:54 np0005532763 nova_compute[231311]: 2025-11-23 21:21:54.223 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:54.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:55 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:21:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:21:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:55.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:21:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:55 np0005532763 nova_compute[231311]: 2025-11-23 21:21:55.422 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:21:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:56.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:21:56 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:21:56 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:21:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:21:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:21:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:21:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:21:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:21:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:57.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:21:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:21:58.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:21:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:21:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:21:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:21:59.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:21:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:21:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:21:59 np0005532763 nova_compute[231311]: 2025-11-23 21:21:59.225 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:21:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:21:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:00 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:22:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:00.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:00 np0005532763 nova_compute[231311]: 2025-11-23 21:22:00.463 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:22:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:22:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:01.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:22:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:22:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:22:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:22:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:22:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:22:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:02.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:22:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:03.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:03 np0005532763 podman[253382]: 2025-11-23 21:22:03.194406194 +0000 UTC m=+0.073864345 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 23 16:22:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:04 np0005532763 nova_compute[231311]: 2025-11-23 21:22:04.249 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:22:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:04.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:05 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:22:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:05.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:05 np0005532763 nova_compute[231311]: 2025-11-23 21:22:05.516 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:22:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:06.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:22:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:22:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:22:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:22:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:07.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:22:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1782666011' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:22:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:22:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1782666011' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:22:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.004000112s ======
Nov 23 16:22:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:08.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000112s
Nov 23 16:22:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:22:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:09.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:22:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:09 np0005532763 nova_compute[231311]: 2025-11-23 21:22:09.292 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:22:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:10 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:22:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:22:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:10.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:22:10 np0005532763 nova_compute[231311]: 2025-11-23 21:22:10.549 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:22:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:11.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:11 np0005532763 nova_compute[231311]: 2025-11-23 21:22:11.951 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:22:11 np0005532763 nova_compute[231311]: 2025-11-23 21:22:11.952 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:22:11 np0005532763 nova_compute[231311]: 2025-11-23 21:22:11.965 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:22:11 np0005532763 nova_compute[231311]: 2025-11-23 21:22:11.966 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:22:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:22:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:22:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:22:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:22:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:12.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:13.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:14 np0005532763 nova_compute[231311]: 2025-11-23 21:22:14.303 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:22:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:14 np0005532763 nova_compute[231311]: 2025-11-23 21:22:14.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:22:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:14.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:22:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:22:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:15.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:22:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:15 np0005532763 podman[253439]: 2025-11-23 21:22:15.21808223 +0000 UTC m=+0.093777447 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 23 16:22:15 np0005532763 podman[253440]: 2025-11-23 21:22:15.261132734 +0000 UTC m=+0.124408070 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 23 16:22:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:15 np0005532763 nova_compute[231311]: 2025-11-23 21:22:15.401 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:22:15 np0005532763 nova_compute[231311]: 2025-11-23 21:22:15.402 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:22:15 np0005532763 nova_compute[231311]: 2025-11-23 21:22:15.402 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:22:15 np0005532763 nova_compute[231311]: 2025-11-23 21:22:15.418 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:22:15 np0005532763 nova_compute[231311]: 2025-11-23 21:22:15.586 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:22:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:22:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:16.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:22:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:22:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:22:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:22:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:22:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:17.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:17 np0005532763 nova_compute[231311]: 2025-11-23 21:22:17.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:22:17 np0005532763 nova_compute[231311]: 2025-11-23 21:22:17.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:22:17 np0005532763 nova_compute[231311]: 2025-11-23 21:22:17.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:22:17 np0005532763 nova_compute[231311]: 2025-11-23 21:22:17.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:22:17 np0005532763 nova_compute[231311]: 2025-11-23 21:22:17.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 23 16:22:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:18 np0005532763 nova_compute[231311]: 2025-11-23 21:22:18.394 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:22:18 np0005532763 nova_compute[231311]: 2025-11-23 21:22:18.395 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:22:18 np0005532763 nova_compute[231311]: 2025-11-23 21:22:18.420 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:22:18 np0005532763 nova_compute[231311]: 2025-11-23 21:22:18.420 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:22:18 np0005532763 nova_compute[231311]: 2025-11-23 21:22:18.421 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:22:18 np0005532763 nova_compute[231311]: 2025-11-23 21:22:18.421 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:22:18 np0005532763 nova_compute[231311]: 2025-11-23 21:22:18.421 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:22:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:18.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:18 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:22:18 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3698910792' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:22:18 np0005532763 nova_compute[231311]: 2025-11-23 21:22:18.867 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:22:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:19.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:19 np0005532763 nova_compute[231311]: 2025-11-23 21:22:19.130 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:22:19 np0005532763 nova_compute[231311]: 2025-11-23 21:22:19.132 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4808MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:22:19 np0005532763 nova_compute[231311]: 2025-11-23 21:22:19.132 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:22:19 np0005532763 nova_compute[231311]: 2025-11-23 21:22:19.132 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:22:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:19 np0005532763 nova_compute[231311]: 2025-11-23 21:22:19.222 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:22:19 np0005532763 nova_compute[231311]: 2025-11-23 21:22:19.223 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:22:19 np0005532763 nova_compute[231311]: 2025-11-23 21:22:19.293 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:22:19 np0005532763 nova_compute[231311]: 2025-11-23 21:22:19.345 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:22:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:22:19 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1267239501' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:22:19 np0005532763 nova_compute[231311]: 2025-11-23 21:22:19.799 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:22:19 np0005532763 nova_compute[231311]: 2025-11-23 21:22:19.805 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:22:19 np0005532763 nova_compute[231311]: 2025-11-23 21:22:19.822 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:22:19 np0005532763 nova_compute[231311]: 2025-11-23 21:22:19.825 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:22:19 np0005532763 nova_compute[231311]: 2025-11-23 21:22:19.825 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.693s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:22:20 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:22:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:20.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:20 np0005532763 nova_compute[231311]: 2025-11-23 21:22:20.639 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:22:20 np0005532763 nova_compute[231311]: 2025-11-23 21:22:20.814 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:22:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:21.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:22:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:22:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:22:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:22:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:22 np0005532763 nova_compute[231311]: 2025-11-23 21:22:22.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:22:22 np0005532763 nova_compute[231311]: 2025-11-23 21:22:22.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 23 16:22:22 np0005532763 nova_compute[231311]: 2025-11-23 21:22:22.403 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 23 16:22:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:22:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:22.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:22:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:22:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:23.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:22:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:24 np0005532763 nova_compute[231311]: 2025-11-23 21:22:24.346 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:22:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:22:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:24.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:22:25 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:22:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:25.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:25 np0005532763 nova_compute[231311]: 2025-11-23 21:22:25.685 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:22:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:22:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:26.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:22:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:22:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:22:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:22:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:22:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:27.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:22:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:28.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:22:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:29.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:29 np0005532763 nova_compute[231311]: 2025-11-23 21:22:29.391 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:22:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:30.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:30 np0005532763 nova_compute[231311]: 2025-11-23 21:22:30.726 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:22:30.829090) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932950829138, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 962, "num_deletes": 250, "total_data_size": 2086025, "memory_usage": 2116912, "flush_reason": "Manual Compaction"}
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932950837111, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 901115, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37055, "largest_seqno": 38012, "table_properties": {"data_size": 897476, "index_size": 1355, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9792, "raw_average_key_size": 20, "raw_value_size": 889693, "raw_average_value_size": 1901, "num_data_blocks": 58, "num_entries": 468, "num_filter_entries": 468, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763932880, "oldest_key_time": 1763932880, "file_creation_time": 1763932950, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 8105 microseconds, and 4311 cpu microseconds.
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:22:30.837190) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 901115 bytes OK
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:22:30.837224) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:22:30.838951) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:22:30.838975) EVENT_LOG_v1 {"time_micros": 1763932950838967, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:22:30.838999) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 2081249, prev total WAL file size 2081249, number of live WAL files 2.
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:22:30.840149) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303036' seq:72057594037927935, type:22 .. '6D6772737461740031323537' seq:0, type:0; will stop at (end)
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(879KB)], [69(14MB)]
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932950840218, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 16082423, "oldest_snapshot_seqno": -1}
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6508 keys, 12457179 bytes, temperature: kUnknown
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932950911667, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 12457179, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12416877, "index_size": 22912, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16325, "raw_key_size": 171509, "raw_average_key_size": 26, "raw_value_size": 12302782, "raw_average_value_size": 1890, "num_data_blocks": 897, "num_entries": 6508, "num_filter_entries": 6508, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763932950, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:22:30.912009) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 12457179 bytes
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:22:30.913369) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 224.8 rd, 174.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 14.5 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(31.7) write-amplify(13.8) OK, records in: 6995, records dropped: 487 output_compression: NoCompression
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:22:30.913399) EVENT_LOG_v1 {"time_micros": 1763932950913385, "job": 42, "event": "compaction_finished", "compaction_time_micros": 71544, "compaction_time_cpu_micros": 53323, "output_level": 6, "num_output_files": 1, "total_output_size": 12457179, "num_input_records": 6995, "num_output_records": 6508, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932950913819, "job": 42, "event": "table_file_deletion", "file_number": 71}
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763932950918750, "job": 42, "event": "table_file_deletion", "file_number": 69}
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:22:30.840044) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:22:30.918899) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:22:30.918908) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:22:30.918913) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:22:30.918918) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:22:30 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:22:30.918922) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:22:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:31.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:22:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:22:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:22:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:22:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:32.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:33.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:34 np0005532763 podman[253574]: 2025-11-23 21:22:34.208259451 +0000 UTC m=+0.085965566 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 16:22:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:34 np0005532763 nova_compute[231311]: 2025-11-23 21:22:34.394 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:22:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:22:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:34.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:22:35 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:22:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:22:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:35.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:22:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:35 np0005532763 nova_compute[231311]: 2025-11-23 21:22:35.767 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:22:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:36.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:22:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:22:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:22:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:22:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:37.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:22:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:38.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:22:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:39.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:39 np0005532763 nova_compute[231311]: 2025-11-23 21:22:39.397 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:22:40 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:22:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:40.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:40 np0005532763 nova_compute[231311]: 2025-11-23 21:22:40.809 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:22:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:22:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:41.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:22:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:22:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:22:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:22:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:22:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:22:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:42.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:22:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:43.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:44 np0005532763 nova_compute[231311]: 2025-11-23 21:22:44.400 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:22:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:22:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:44.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:22:45 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:22:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:45.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:45 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Nov 23 16:22:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:45 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Nov 23 16:22:45 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Nov 23 16:22:45 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Nov 23 16:22:45 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Nov 23 16:22:45 np0005532763 radosgw[84112]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Nov 23 16:22:45 np0005532763 nova_compute[231311]: 2025-11-23 21:22:45.846 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:22:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:46 np0005532763 podman[253605]: 2025-11-23 21:22:46.226225767 +0000 UTC m=+0.093255172 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 16:22:46 np0005532763 podman[253606]: 2025-11-23 21:22:46.272292487 +0000 UTC m=+0.135913356 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller)
Nov 23 16:22:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:46.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:22:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:22:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:22:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:22:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:47.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:48.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:22:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:49.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:22:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:49 np0005532763 nova_compute[231311]: 2025-11-23 21:22:49.404 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:22:50 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:22:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:50 np0005532763 nova_compute[231311]: 2025-11-23 21:22:50.300 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:22:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:50.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:50 np0005532763 nova_compute[231311]: 2025-11-23 21:22:50.880 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:22:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:22:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:51.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:22:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:22:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:22:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:22:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:22:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:22:52.238 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:22:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:22:52.238 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:22:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:22:52.238 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:22:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:22:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:52.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:22:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:53.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:54 np0005532763 nova_compute[231311]: 2025-11-23 21:22:54.405 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:22:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:54.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:55 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:22:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:55.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:55 np0005532763 nova_compute[231311]: 2025-11-23 21:22:55.883 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:22:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:22:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:56.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:22:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:22:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:22:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:22:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:22:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:22:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:22:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:57.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:22:57 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:22:57 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:22:57 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:22:57 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:22:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000057s ======
Nov 23 16:22:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:22:58.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Nov 23 16:22:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:22:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:22:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:22:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:22:59.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:22:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:22:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:22:59 np0005532763 nova_compute[231311]: 2025-11-23 21:22:59.407 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:00 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:23:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:00.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:00 np0005532763 nova_compute[231311]: 2025-11-23 21:23:00.919 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:23:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:01.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:23:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:23:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:23:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:23:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:23:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:02 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:23:02 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:23:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:23:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:02.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:23:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:23:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:03.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:23:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:04 np0005532763 nova_compute[231311]: 2025-11-23 21:23:04.410 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:23:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:04.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:23:05 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:23:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:05.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:05 np0005532763 podman[253801]: 2025-11-23 21:23:05.213825465 +0000 UTC m=+0.090191596 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 16:23:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:05 np0005532763 nova_compute[231311]: 2025-11-23 21:23:05.922 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:06.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:23:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:23:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:23:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:23:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:07.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:08.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:09.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:09 np0005532763 nova_compute[231311]: 2025-11-23 21:23:09.511 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:10 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:23:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:10.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:10 np0005532763 nova_compute[231311]: 2025-11-23 21:23:10.962 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:11 np0005532763 nova_compute[231311]: 2025-11-23 21:23:11.397 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:23:11 np0005532763 nova_compute[231311]: 2025-11-23 21:23:11.398 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:23:11 np0005532763 nova_compute[231311]: 2025-11-23 21:23:11.398 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:23:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:11.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:23:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:23:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:23:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:23:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:12.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:13.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:14 np0005532763 nova_compute[231311]: 2025-11-23 21:23:14.513 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:23:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:14.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:23:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:23:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:15.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:16 np0005532763 nova_compute[231311]: 2025-11-23 21:23:16.003 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:16.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:23:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:23:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:23:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:23:17 np0005532763 nova_compute[231311]: 2025-11-23 21:23:17.083 231315 DEBUG oslo_concurrency.processutils [None req-d89b790b-8376-465b-8448-23090b964ac1 8c34b8adab3049c9b4e37e075333da23 3f8fb5175f85402ba20cf9c6989d47cf - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:23:17 np0005532763 nova_compute[231311]: 2025-11-23 21:23:17.131 231315 DEBUG oslo_concurrency.processutils [None req-d89b790b-8376-465b-8448-23090b964ac1 8c34b8adab3049c9b4e37e075333da23 3f8fb5175f85402ba20cf9c6989d47cf - - default default] CMD "env LANG=C uptime" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:23:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:17 np0005532763 podman[253858]: 2025-11-23 21:23:17.23073839 +0000 UTC m=+0.103819720 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 16:23:17 np0005532763 podman[253859]: 2025-11-23 21:23:17.259813571 +0000 UTC m=+0.125997367 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 23 16:23:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:17 np0005532763 nova_compute[231311]: 2025-11-23 21:23:17.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:23:17 np0005532763 nova_compute[231311]: 2025-11-23 21:23:17.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:23:17 np0005532763 nova_compute[231311]: 2025-11-23 21:23:17.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:23:17 np0005532763 nova_compute[231311]: 2025-11-23 21:23:17.405 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:23:17 np0005532763 nova_compute[231311]: 2025-11-23 21:23:17.406 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:23:17 np0005532763 nova_compute[231311]: 2025-11-23 21:23:17.407 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:23:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:17.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:18 np0005532763 nova_compute[231311]: 2025-11-23 21:23:18.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:23:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:18.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:19.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:19 np0005532763 nova_compute[231311]: 2025-11-23 21:23:19.548 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:20 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:23:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:20 np0005532763 nova_compute[231311]: 2025-11-23 21:23:20.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:23:20 np0005532763 nova_compute[231311]: 2025-11-23 21:23:20.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:23:20 np0005532763 nova_compute[231311]: 2025-11-23 21:23:20.410 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:23:20 np0005532763 nova_compute[231311]: 2025-11-23 21:23:20.411 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:23:20 np0005532763 nova_compute[231311]: 2025-11-23 21:23:20.411 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:23:20 np0005532763 nova_compute[231311]: 2025-11-23 21:23:20.411 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:23:20 np0005532763 nova_compute[231311]: 2025-11-23 21:23:20.412 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:23:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:23:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:20.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:23:20 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:23:20 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3527708452' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:23:20 np0005532763 nova_compute[231311]: 2025-11-23 21:23:20.875 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:23:21 np0005532763 nova_compute[231311]: 2025-11-23 21:23:21.040 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:21 np0005532763 nova_compute[231311]: 2025-11-23 21:23:21.161 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:23:21 np0005532763 nova_compute[231311]: 2025-11-23 21:23:21.162 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4818MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 23 16:23:21 np0005532763 nova_compute[231311]: 2025-11-23 21:23:21.163 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 23 16:23:21 np0005532763 nova_compute[231311]: 2025-11-23 21:23:21.163 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 23 16:23:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:21 np0005532763 nova_compute[231311]: 2025-11-23 21:23:21.235 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 23 16:23:21 np0005532763 nova_compute[231311]: 2025-11-23 21:23:21.236 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 23 16:23:21 np0005532763 nova_compute[231311]: 2025-11-23 21:23:21.257 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Refreshing inventories for resource provider 20c32e0a-de2c-427c-9273-fac11e2660f4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 23 16:23:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:21 np0005532763 nova_compute[231311]: 2025-11-23 21:23:21.409 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Updating ProviderTree inventory for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 23 16:23:21 np0005532763 nova_compute[231311]: 2025-11-23 21:23:21.410 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Updating inventory in ProviderTree for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 23 16:23:21 np0005532763 nova_compute[231311]: 2025-11-23 21:23:21.436 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Refreshing aggregate associations for resource provider 20c32e0a-de2c-427c-9273-fac11e2660f4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 23 16:23:21 np0005532763 nova_compute[231311]: 2025-11-23 21:23:21.483 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Refreshing trait associations for resource provider 20c32e0a-de2c-427c-9273-fac11e2660f4, traits: COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE2,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_AVX2,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,HW_CPU_X86_F16C,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_FMA3,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE,HW_CPU_X86_SVM,HW_CPU_X86_BMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 23 16:23:21 np0005532763 nova_compute[231311]: 2025-11-23 21:23:21.502 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 23 16:23:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:21.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:21 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:23:21 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3272348093' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:23:21 np0005532763 nova_compute[231311]: 2025-11-23 21:23:21.994 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 23 16:23:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:23:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:23:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:23:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:23:22 np0005532763 nova_compute[231311]: 2025-11-23 21:23:22.002 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 23 16:23:22 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:23:22.007 142920 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '3a:26:f0', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8a:d5:4d:db:d5:2b'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 23 16:23:22 np0005532763 nova_compute[231311]: 2025-11-23 21:23:22.008 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:23:22 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:23:22.009 142920 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 23 16:23:22 np0005532763 nova_compute[231311]: 2025-11-23 21:23:22.033 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 23 16:23:22 np0005532763 nova_compute[231311]: 2025-11-23 21:23:22.036 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 23 16:23:22 np0005532763 nova_compute[231311]: 2025-11-23 21:23:22.036 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.873s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 23 16:23:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:23:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:22.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:23:23 np0005532763 nova_compute[231311]: 2025-11-23 21:23:23.037 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 16:23:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:23:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:23.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:23:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:24 np0005532763 nova_compute[231311]: 2025-11-23 21:23:24.550 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:23:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:24.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:25 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:23:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:25.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:26 np0005532763 nova_compute[231311]: 2025-11-23 21:23:26.043 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:23:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:23:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:26.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:23:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:23:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:23:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:23:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:23:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:23:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:27.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:23:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:28.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:23:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:29.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:23:29 np0005532763 nova_compute[231311]: 2025-11-23 21:23:29.553 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:23:30 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:23:30.012 142920 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10e3bf57-dd2d-4b94-851f-925bcd297dde, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 23 16:23:30 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:23:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:23:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:30.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:23:31 np0005532763 nova_compute[231311]: 2025-11-23 21:23:31.076 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:23:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:23:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:31.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:23:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:23:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:23:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:23:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:23:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:32.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:23:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:33.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:23:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:23:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:34.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:23:34 np0005532763 nova_compute[231311]: 2025-11-23 21:23:34.599 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:23:35 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:23:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:23:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:35.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:23:36 np0005532763 nova_compute[231311]: 2025-11-23 21:23:36.112 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:23:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:36 np0005532763 podman[253993]: 2025-11-23 21:23:36.222421269 +0000 UTC m=+0.074175284 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 23 16:23:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:23:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:36.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:23:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:23:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:23:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:23:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:23:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:23:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:37.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:23:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:38.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000057s ======
Nov 23 16:23:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:39.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Nov 23 16:23:39 np0005532763 nova_compute[231311]: 2025-11-23 21:23:39.601 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:40 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:23:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:23:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:40.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:23:41 np0005532763 nova_compute[231311]: 2025-11-23 21:23:41.115 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:23:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:41.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:23:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:23:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:23:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:23:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:23:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:42.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:23:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:43.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:23:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:44.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:44 np0005532763 nova_compute[231311]: 2025-11-23 21:23:44.605 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:45 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:23:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:45.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:46 np0005532763 nova_compute[231311]: 2025-11-23 21:23:46.162 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:23:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:46.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:23:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:23:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:23:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:23:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:23:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:47.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:48 np0005532763 podman[254025]: 2025-11-23 21:23:48.228798556 +0000 UTC m=+0.101135878 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 23 16:23:48 np0005532763 podman[254026]: 2025-11-23 21:23:48.278928517 +0000 UTC m=+0.146331879 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true)
Nov 23 16:23:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:48.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:49.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:49 np0005532763 nova_compute[231311]: 2025-11-23 21:23:49.643 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:50 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:23:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:23:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:50.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:23:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:51 np0005532763 nova_compute[231311]: 2025-11-23 21:23:51.198 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:51.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:23:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:23:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:23:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:23:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:23:52.239 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:23:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:23:52.239 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:23:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:23:52.240 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:23:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:52.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:23:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:53.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:23:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:54.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:54 np0005532763 nova_compute[231311]: 2025-11-23 21:23:54.676 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:55 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:23:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:23:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:55.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:23:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:56 np0005532763 nova_compute[231311]: 2025-11-23 21:23:56.200 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:23:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:56.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:23:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:23:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:23:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:23:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:23:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:23:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:57.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:23:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:23:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:23:58.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:23:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:23:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:23:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:23:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:23:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:23:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:23:59.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:23:59 np0005532763 nova_compute[231311]: 2025-11-23 21:23:59.720 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:24:00 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:24:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:00.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:01 np0005532763 nova_compute[231311]: 2025-11-23 21:24:01.204 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:24:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:01 np0005532763 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 16:24:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:01.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:01 np0005532763 rsyslogd[1011]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 23 16:24:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:24:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:24:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:24:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:24:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:02.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:03.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:04 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:24:04 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:24:04 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:24:04 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:24:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:04.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:04 np0005532763 nova_compute[231311]: 2025-11-23 21:24:04.722 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:24:05 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:24:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:05.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:06 np0005532763 nova_compute[231311]: 2025-11-23 21:24:06.206 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:24:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:06.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:24:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:24:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:24:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:24:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:07 np0005532763 podman[254193]: 2025-11-23 21:24:07.218534144 +0000 UTC m=+0.083551709 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 23 16:24:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:07.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:24:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/926478707' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:24:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:24:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/926478707' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:24:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:24:08 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:24:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:24:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:08.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:24:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:24:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:09.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:24:09 np0005532763 nova_compute[231311]: 2025-11-23 21:24:09.724 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:24:10 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:24:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:10.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:11 np0005532763 nova_compute[231311]: 2025-11-23 21:24:11.209 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:24:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:11.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:24:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:24:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:24:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:24:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:12.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:13 np0005532763 nova_compute[231311]: 2025-11-23 21:24:13.379 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 16:24:13 np0005532763 nova_compute[231311]: 2025-11-23 21:24:13.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 16:24:13 np0005532763 nova_compute[231311]: 2025-11-23 21:24:13.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 16:24:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:24:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:13.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:24:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:14.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:14 np0005532763 nova_compute[231311]: 2025-11-23 21:24:14.726 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:24:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:24:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:15 np0005532763 nova_compute[231311]: 2025-11-23 21:24:15.379 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 16:24:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:15.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:16 np0005532763 nova_compute[231311]: 2025-11-23 21:24:16.256 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:24:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:16.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:24:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:24:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:24:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:24:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:17 np0005532763 nova_compute[231311]: 2025-11-23 21:24:17.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 16:24:17 np0005532763 nova_compute[231311]: 2025-11-23 21:24:17.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 23 16:24:17 np0005532763 nova_compute[231311]: 2025-11-23 21:24:17.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 23 16:24:17 np0005532763 nova_compute[231311]: 2025-11-23 21:24:17.399 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 23 16:24:17 np0005532763 nova_compute[231311]: 2025-11-23 21:24:17.399 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 16:24:17 np0005532763 nova_compute[231311]: 2025-11-23 21:24:17.399 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 23 16:24:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:17.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:18 np0005532763 nova_compute[231311]: 2025-11-23 21:24:18.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 16:24:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:18.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:19 np0005532763 podman[254274]: 2025-11-23 21:24:19.229212912 +0000 UTC m=+0.102326052 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:24:19 np0005532763 podman[254275]: 2025-11-23 21:24:19.259044018 +0000 UTC m=+0.126630181 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller)
Nov 23 16:24:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000057s ======
Nov 23 16:24:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:19.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Nov 23 16:24:19 np0005532763 nova_compute[231311]: 2025-11-23 21:24:19.728 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:24:20 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:24:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:24:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:20.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:24:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:21 np0005532763 nova_compute[231311]: 2025-11-23 21:24:21.259 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:24:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:21 np0005532763 nova_compute[231311]: 2025-11-23 21:24:21.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:24:21 np0005532763 nova_compute[231311]: 2025-11-23 21:24:21.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:24:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:21.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:24:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:24:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:24:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:24:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:22 np0005532763 nova_compute[231311]: 2025-11-23 21:24:22.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:24:22 np0005532763 nova_compute[231311]: 2025-11-23 21:24:22.427 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:24:22 np0005532763 nova_compute[231311]: 2025-11-23 21:24:22.427 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:24:22 np0005532763 nova_compute[231311]: 2025-11-23 21:24:22.428 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:24:22 np0005532763 nova_compute[231311]: 2025-11-23 21:24:22.428 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:24:22 np0005532763 nova_compute[231311]: 2025-11-23 21:24:22.429 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:24:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:22.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:22 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:24:22 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2216084231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:24:22 np0005532763 nova_compute[231311]: 2025-11-23 21:24:22.897 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:24:23 np0005532763 nova_compute[231311]: 2025-11-23 21:24:23.143 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:24:23 np0005532763 nova_compute[231311]: 2025-11-23 21:24:23.145 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4803MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:24:23 np0005532763 nova_compute[231311]: 2025-11-23 21:24:23.145 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:24:23 np0005532763 nova_compute[231311]: 2025-11-23 21:24:23.146 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:24:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:23 np0005532763 nova_compute[231311]: 2025-11-23 21:24:23.225 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:24:23 np0005532763 nova_compute[231311]: 2025-11-23 21:24:23.226 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:24:23 np0005532763 nova_compute[231311]: 2025-11-23 21:24:23.255 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:24:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:24:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:23.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:24:23 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:24:23 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3597113063' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:24:23 np0005532763 nova_compute[231311]: 2025-11-23 21:24:23.729 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:24:23 np0005532763 nova_compute[231311]: 2025-11-23 21:24:23.738 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:24:23 np0005532763 nova_compute[231311]: 2025-11-23 21:24:23.757 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:24:23 np0005532763 nova_compute[231311]: 2025-11-23 21:24:23.760 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:24:23 np0005532763 nova_compute[231311]: 2025-11-23 21:24:23.760 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.614s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:24:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:24.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:24 np0005532763 nova_compute[231311]: 2025-11-23 21:24:24.730 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:24:25 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:24:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:25.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:26 np0005532763 nova_compute[231311]: 2025-11-23 21:24:26.300 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:24:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:26.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:24:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:24:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:24:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:24:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:24:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:27.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:24:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:24:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:28.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:24:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:24:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:29.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:24:29 np0005532763 nova_compute[231311]: 2025-11-23 21:24:29.732 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:24:30 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:24:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:30.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:31 np0005532763 nova_compute[231311]: 2025-11-23 21:24:31.302 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:24:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:31.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:24:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:24:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:24:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:24:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:32.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:24:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:33.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:24:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:34.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:34 np0005532763 nova_compute[231311]: 2025-11-23 21:24:34.734 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:24:35 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:24:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:35.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:36 np0005532763 nova_compute[231311]: 2025-11-23 21:24:36.306 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:24:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:36.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:24:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:24:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:24:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:24:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:37.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:38 np0005532763 podman[254407]: 2025-11-23 21:24:38.230797866 +0000 UTC m=+0.102206388 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent)
Nov 23 16:24:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:38.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:39.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:39 np0005532763 nova_compute[231311]: 2025-11-23 21:24:39.737 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:24:40 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:24:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:40.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:41 np0005532763 nova_compute[231311]: 2025-11-23 21:24:41.309 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:24:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:41.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:24:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:24:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:24:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:24:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:42.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:43.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:44.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:44 np0005532763 nova_compute[231311]: 2025-11-23 21:24:44.770 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:24:45 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:24:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:45.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:46 np0005532763 nova_compute[231311]: 2025-11-23 21:24:46.312 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:24:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:46.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:24:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:24:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:24:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:24:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:24:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:47.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:24:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:48.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:24:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:49.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:24:49 np0005532763 nova_compute[231311]: 2025-11-23 21:24:49.772 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:24:50 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:24:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:50 np0005532763 podman[254463]: 2025-11-23 21:24:50.199354429 +0000 UTC m=+0.080984547 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 16:24:50 np0005532763 podman[254464]: 2025-11-23 21:24:50.242716728 +0000 UTC m=+0.125908030 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 23 16:24:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:50.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:51 np0005532763 nova_compute[231311]: 2025-11-23 21:24:51.314 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:24:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:51.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:24:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:24:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:24:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:24:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:24:52.241 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:24:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:24:52.241 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:24:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:24:52.241 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:24:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:52.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:53.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:54.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:54 np0005532763 nova_compute[231311]: 2025-11-23 21:24:54.774 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:24:55 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:24:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:55.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:56 np0005532763 nova_compute[231311]: 2025-11-23 21:24:56.316 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:24:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:24:56.688295) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763933096688366, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 1643, "num_deletes": 251, "total_data_size": 4104514, "memory_usage": 4152096, "flush_reason": "Manual Compaction"}
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763933096706106, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 2679315, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38017, "largest_seqno": 39655, "table_properties": {"data_size": 2672517, "index_size": 3933, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14367, "raw_average_key_size": 20, "raw_value_size": 2658824, "raw_average_value_size": 3713, "num_data_blocks": 171, "num_entries": 716, "num_filter_entries": 716, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763932950, "oldest_key_time": 1763932950, "file_creation_time": 1763933096, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 17866 microseconds, and 7971 cpu microseconds.
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:24:56.706174) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 2679315 bytes OK
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:24:56.706204) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:24:56.709844) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:24:56.709865) EVENT_LOG_v1 {"time_micros": 1763933096709859, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:24:56.709891) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 4097094, prev total WAL file size 4097094, number of live WAL files 2.
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:24:56.711044) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(2616KB)], [72(11MB)]
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763933096711151, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 15136494, "oldest_snapshot_seqno": -1}
Nov 23 16:24:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:56.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6708 keys, 12987028 bytes, temperature: kUnknown
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763933096786700, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 12987028, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12944992, "index_size": 24132, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16837, "raw_key_size": 176364, "raw_average_key_size": 26, "raw_value_size": 12826896, "raw_average_value_size": 1912, "num_data_blocks": 946, "num_entries": 6708, "num_filter_entries": 6708, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763933096, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:24:56.787016) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 12987028 bytes
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:24:56.791145) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 200.1 rd, 171.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 11.9 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(10.5) write-amplify(4.8) OK, records in: 7224, records dropped: 516 output_compression: NoCompression
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:24:56.791313) EVENT_LOG_v1 {"time_micros": 1763933096791255, "job": 44, "event": "compaction_finished", "compaction_time_micros": 75628, "compaction_time_cpu_micros": 47733, "output_level": 6, "num_output_files": 1, "total_output_size": 12987028, "num_input_records": 7224, "num_output_records": 6708, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763933096792407, "job": 44, "event": "table_file_deletion", "file_number": 74}
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763933096796861, "job": 44, "event": "table_file_deletion", "file_number": 72}
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:24:56.710872) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:24:56.797004) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:24:56.797012) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:24:56.797013) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:24:56.797015) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:24:56 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:24:56.797017) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:24:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:24:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:24:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:24:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:24:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:24:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:24:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:57.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:24:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:24:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:24:58.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:24:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:24:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:24:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:24:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:24:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000057s ======
Nov 23 16:24:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:24:59.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Nov 23 16:24:59 np0005532763 nova_compute[231311]: 2025-11-23 21:24:59.776 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:00 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:25:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:00.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:01 np0005532763 nova_compute[231311]: 2025-11-23 21:25:01.319 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:01.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:25:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:25:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:25:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:25:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:25:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:02.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:25:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:25:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:03.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:25:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:04.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:04 np0005532763 nova_compute[231311]: 2025-11-23 21:25:04.779 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:05 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:25:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:05.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:06 np0005532763 nova_compute[231311]: 2025-11-23 21:25:06.363 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:06.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:25:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:25:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:25:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:25:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:25:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:07.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:25:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:25:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4110331943' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:25:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:25:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4110331943' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:25:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:25:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:08.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:25:08 np0005532763 podman[254551]: 2025-11-23 21:25:08.907607401 +0000 UTC m=+0.083154818 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 16:25:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:09 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Nov 23 16:25:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:09.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:09 np0005532763 nova_compute[231311]: 2025-11-23 21:25:09.782 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:10 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:25:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:25:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:25:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:25:10 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:25:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:10.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:11 np0005532763 nova_compute[231311]: 2025-11-23 21:25:11.366 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:11.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:25:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:25:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:25:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:25:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:12.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:13.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:14.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:14 np0005532763 nova_compute[231311]: 2025-11-23 21:25:14.803 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:25:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:15.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:16 np0005532763 nova_compute[231311]: 2025-11-23 21:25:16.369 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:16 np0005532763 nova_compute[231311]: 2025-11-23 21:25:16.756 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:25:16 np0005532763 nova_compute[231311]: 2025-11-23 21:25:16.757 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:25:16 np0005532763 nova_compute[231311]: 2025-11-23 21:25:16.757 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:25:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:25:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:16.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:25:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:25:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:25:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:25:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:25:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:17 np0005532763 nova_compute[231311]: 2025-11-23 21:25:17.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:25:17 np0005532763 nova_compute[231311]: 2025-11-23 21:25:17.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:25:17 np0005532763 nova_compute[231311]: 2025-11-23 21:25:17.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:25:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:17 np0005532763 nova_compute[231311]: 2025-11-23 21:25:17.395 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:25:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:17.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:17 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:25:17 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:25:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:18 np0005532763 nova_compute[231311]: 2025-11-23 21:25:18.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:25:18 np0005532763 nova_compute[231311]: 2025-11-23 21:25:18.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:25:18 np0005532763 nova_compute[231311]: 2025-11-23 21:25:18.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:25:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:18.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:19.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:19 np0005532763 nova_compute[231311]: 2025-11-23 21:25:19.839 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:20 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:25:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:25:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:20.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:25:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:21 np0005532763 podman[254687]: 2025-11-23 21:25:21.234953886 +0000 UTC m=+0.106701856 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 16:25:21 np0005532763 podman[254688]: 2025-11-23 21:25:21.284694406 +0000 UTC m=+0.152687199 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:25:21 np0005532763 nova_compute[231311]: 2025-11-23 21:25:21.371 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:21.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:25:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:25:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:25:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:25:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:22 np0005532763 nova_compute[231311]: 2025-11-23 21:25:22.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:25:22 np0005532763 nova_compute[231311]: 2025-11-23 21:25:22.384 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:25:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:25:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:22.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:25:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:23 np0005532763 nova_compute[231311]: 2025-11-23 21:25:23.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:25:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:23 np0005532763 nova_compute[231311]: 2025-11-23 21:25:23.412 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:25:23 np0005532763 nova_compute[231311]: 2025-11-23 21:25:23.412 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:25:23 np0005532763 nova_compute[231311]: 2025-11-23 21:25:23.412 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:25:23 np0005532763 nova_compute[231311]: 2025-11-23 21:25:23.413 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:25:23 np0005532763 nova_compute[231311]: 2025-11-23 21:25:23.413 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:25:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:23.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:23 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:25:23 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/744591241' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:25:23 np0005532763 nova_compute[231311]: 2025-11-23 21:25:23.900 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:25:24 np0005532763 nova_compute[231311]: 2025-11-23 21:25:24.109 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:25:24 np0005532763 nova_compute[231311]: 2025-11-23 21:25:24.109 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4810MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:25:24 np0005532763 nova_compute[231311]: 2025-11-23 21:25:24.110 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:25:24 np0005532763 nova_compute[231311]: 2025-11-23 21:25:24.110 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:25:24 np0005532763 nova_compute[231311]: 2025-11-23 21:25:24.172 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:25:24 np0005532763 nova_compute[231311]: 2025-11-23 21:25:24.173 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:25:24 np0005532763 nova_compute[231311]: 2025-11-23 21:25:24.184 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:25:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:25:24 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3439106909' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:25:24 np0005532763 nova_compute[231311]: 2025-11-23 21:25:24.676 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:25:24 np0005532763 nova_compute[231311]: 2025-11-23 21:25:24.683 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:25:24 np0005532763 nova_compute[231311]: 2025-11-23 21:25:24.701 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:25:24 np0005532763 nova_compute[231311]: 2025-11-23 21:25:24.703 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:25:24 np0005532763 nova_compute[231311]: 2025-11-23 21:25:24.703 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.593s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:25:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:24.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:24 np0005532763 nova_compute[231311]: 2025-11-23 21:25:24.871 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:25 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:25:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:25.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:26 np0005532763 nova_compute[231311]: 2025-11-23 21:25:26.373 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:26.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:25:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:25:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:25:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:25:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:25:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:27.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:25:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:28.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:25:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:29.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:25:29 np0005532763 nova_compute[231311]: 2025-11-23 21:25:29.873 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:30 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:25:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:25:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:30.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:25:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:31 np0005532763 nova_compute[231311]: 2025-11-23 21:25:31.415 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:31.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:25:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:25:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:25:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:25:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:25:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:32.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:25:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:33.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:34.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:34 np0005532763 nova_compute[231311]: 2025-11-23 21:25:34.875 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:35 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:25:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:35.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:36 np0005532763 nova_compute[231311]: 2025-11-23 21:25:36.448 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:36.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:25:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:25:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:25:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:25:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:25:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:37.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:25:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:38.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:39 np0005532763 podman[254821]: 2025-11-23 21:25:39.202438615 +0000 UTC m=+0.082372556 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 23 16:25:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:39.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:39 np0005532763 nova_compute[231311]: 2025-11-23 21:25:39.878 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:40 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:25:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:40.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:41 np0005532763 nova_compute[231311]: 2025-11-23 21:25:41.504 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:25:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:41.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:25:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:25:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:25:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:25:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:25:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:25:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:42.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:25:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:25:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:43.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:25:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:44.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:44 np0005532763 nova_compute[231311]: 2025-11-23 21:25:44.880 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:45 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:25:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:45.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:46 np0005532763 nova_compute[231311]: 2025-11-23 21:25:46.545 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:25:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:46.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:25:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:25:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:25:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:25:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:25:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:25:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:47.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:25:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:48.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:49.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:49 np0005532763 nova_compute[231311]: 2025-11-23 21:25:49.883 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:50 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:25:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:50.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:51 np0005532763 nova_compute[231311]: 2025-11-23 21:25:51.597 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:51.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:25:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:25:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:25:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:25:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:52 np0005532763 podman[254878]: 2025-11-23 21:25:52.218725657 +0000 UTC m=+0.092324009 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 23 16:25:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:25:52.241 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:25:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:25:52.242 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:25:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:25:52.242 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:25:52 np0005532763 podman[254879]: 2025-11-23 21:25:52.275544157 +0000 UTC m=+0.141690667 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 23 16:25:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:52.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:53.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:54.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:54 np0005532763 nova_compute[231311]: 2025-11-23 21:25:54.887 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:55 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:25:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:25:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:55.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:25:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:56 np0005532763 nova_compute[231311]: 2025-11-23 21:25:56.629 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:25:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:56.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:25:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:25:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:25:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:25:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:25:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:57.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:25:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:25:58.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:25:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:25:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:25:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:25:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:25:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:25:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:25:59.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:25:59 np0005532763 nova_compute[231311]: 2025-11-23 21:25:59.889 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:26:00 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:26:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:00.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:01 np0005532763 nova_compute[231311]: 2025-11-23 21:26:01.632 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:26:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:01.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:26:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:26:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:26:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:26:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:02.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:03.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:04.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:04 np0005532763 nova_compute[231311]: 2025-11-23 21:26:04.893 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:26:05 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:26:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:05.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:06 np0005532763 nova_compute[231311]: 2025-11-23 21:26:06.635 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:26:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:06.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:26:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:26:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:26:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:26:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:07.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:26:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/579471618' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:26:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:26:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/579471618' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:26:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:08.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:09.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:09 np0005532763 nova_compute[231311]: 2025-11-23 21:26:09.894 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:26:10 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:26:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:10 np0005532763 podman[254942]: 2025-11-23 21:26:10.211096792 +0000 UTC m=+0.090255779 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 16:26:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:10.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:11 np0005532763 nova_compute[231311]: 2025-11-23 21:26:11.638 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:26:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:11.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:26:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:26:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:26:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:26:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:12.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:13.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:14.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:14 np0005532763 nova_compute[231311]: 2025-11-23 21:26:14.898 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:26:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:26:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:15.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:16 np0005532763 nova_compute[231311]: 2025-11-23 21:26:16.642 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:26:16 np0005532763 nova_compute[231311]: 2025-11-23 21:26:16.704 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:26:16 np0005532763 nova_compute[231311]: 2025-11-23 21:26:16.705 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:26:16 np0005532763 nova_compute[231311]: 2025-11-23 21:26:16.705 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:26:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:16.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:26:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:26:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:26:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:26:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:17.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:18 np0005532763 podman[255118]: 2025-11-23 21:26:18.014129122 +0000 UTC m=+0.082877681 container exec 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Nov 23 16:26:18 np0005532763 podman[255118]: 2025-11-23 21:26:18.132608801 +0000 UTC m=+0.201357330 container exec_died 3d9e8671bf7046be20926eab0658c5982e3ccc6c2fb2d9813d3627465564107f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Nov 23 16:26:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:18 np0005532763 podman[255237]: 2025-11-23 21:26:18.765898992 +0000 UTC m=+0.088099008 container exec bfa89024a4f3a8c3745fbdf8141ab9c1af6ff603988de647c9e7f7e15dff8638 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 16:26:18 np0005532763 podman[255237]: 2025-11-23 21:26:18.780750553 +0000 UTC m=+0.102950519 container exec_died bfa89024a4f3a8c3745fbdf8141ab9c1af6ff603988de647c9e7f7e15dff8638 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 23 16:26:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:18.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:19 np0005532763 podman[255329]: 2025-11-23 21:26:19.28861632 +0000 UTC m=+0.084013552 container exec 10ce05665482e9899a7eee0ab4547bdd9a9d872d3217d9554617c432a64e912a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0)
Nov 23 16:26:19 np0005532763 podman[255329]: 2025-11-23 21:26:19.304340866 +0000 UTC m=+0.099738098 container exec_died 10ce05665482e9899a7eee0ab4547bdd9a9d872d3217d9554617c432a64e912a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS)
Nov 23 16:26:19 np0005532763 nova_compute[231311]: 2025-11-23 21:26:19.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:26:19 np0005532763 nova_compute[231311]: 2025-11-23 21:26:19.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:26:19 np0005532763 nova_compute[231311]: 2025-11-23 21:26:19.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:26:19 np0005532763 nova_compute[231311]: 2025-11-23 21:26:19.396 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:26:19 np0005532763 nova_compute[231311]: 2025-11-23 21:26:19.397 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:26:19 np0005532763 nova_compute[231311]: 2025-11-23 21:26:19.397 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:26:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:19 np0005532763 podman[255393]: 2025-11-23 21:26:19.583257043 +0000 UTC m=+0.068622387 container exec 187afc4c1e67339be091cc4caff41c0e2aaba4673fc086f757180d516596ee6c (image=quay.io/ceph/haproxy:2.3, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem)
Nov 23 16:26:19 np0005532763 podman[255393]: 2025-11-23 21:26:19.625653775 +0000 UTC m=+0.111019119 container exec_died 187afc4c1e67339be091cc4caff41c0e2aaba4673fc086f757180d516596ee6c (image=quay.io/ceph/haproxy:2.3, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-haproxy-nfs-cephfs-compute-2-dxqoem)
Nov 23 16:26:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:19.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:19 np0005532763 nova_compute[231311]: 2025-11-23 21:26:19.898 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:26:19 np0005532763 podman[255463]: 2025-11-23 21:26:19.945160972 +0000 UTC m=+0.075249144 container exec f83166e24f35928d8e85c6352ec69e598c685dd22eb2d34bc93aec691f658844 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, distribution-scope=public, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git)
Nov 23 16:26:19 np0005532763 podman[255463]: 2025-11-23 21:26:19.963701078 +0000 UTC m=+0.093789240 container exec_died f83166e24f35928d8e85c6352ec69e598c685dd22eb2d34bc93aec691f658844 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt, vcs-type=git, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.buildah.version=1.28.2, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, release=1793, distribution-scope=public, description=keepalived for Ceph, name=keepalived)
Nov 23 16:26:20 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:26:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:20 np0005532763 nova_compute[231311]: 2025-11-23 21:26:20.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:26:20 np0005532763 nova_compute[231311]: 2025-11-23 21:26:20.398 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:26:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:20 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:26:20 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:26:20 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Nov 23 16:26:20 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:26:20 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:26:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:20.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:21 np0005532763 nova_compute[231311]: 2025-11-23 21:26:21.644 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:26:21 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Nov 23 16:26:21 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:26:21 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:26:21 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:26:21 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:26:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:21.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:26:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:26:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:26:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:26:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:22 np0005532763 nova_compute[231311]: 2025-11-23 21:26:22.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:26:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:22.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:23 np0005532763 podman[255613]: 2025-11-23 21:26:23.237832033 +0000 UTC m=+0.108074115 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 23 16:26:23 np0005532763 podman[255614]: 2025-11-23 21:26:23.279303878 +0000 UTC m=+0.149547850 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 23 16:26:23 np0005532763 nova_compute[231311]: 2025-11-23 21:26:23.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:26:23 np0005532763 nova_compute[231311]: 2025-11-23 21:26:23.405 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:26:23 np0005532763 nova_compute[231311]: 2025-11-23 21:26:23.405 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:26:23 np0005532763 nova_compute[231311]: 2025-11-23 21:26:23.406 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:26:23 np0005532763 nova_compute[231311]: 2025-11-23 21:26:23.406 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:26:23 np0005532763 nova_compute[231311]: 2025-11-23 21:26:23.406 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:26:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:23.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:23 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:26:23 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2246533849' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:26:23 np0005532763 nova_compute[231311]: 2025-11-23 21:26:23.932 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.525s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:26:24 np0005532763 nova_compute[231311]: 2025-11-23 21:26:24.175 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:26:24 np0005532763 nova_compute[231311]: 2025-11-23 21:26:24.177 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4824MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:26:24 np0005532763 nova_compute[231311]: 2025-11-23 21:26:24.177 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:26:24 np0005532763 nova_compute[231311]: 2025-11-23 21:26:24.177 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:26:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:24 np0005532763 nova_compute[231311]: 2025-11-23 21:26:24.242 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:26:24 np0005532763 nova_compute[231311]: 2025-11-23 21:26:24.242 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:26:24 np0005532763 nova_compute[231311]: 2025-11-23 21:26:24.267 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:26:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:24 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:26:24 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2088859336' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:26:24 np0005532763 nova_compute[231311]: 2025-11-23 21:26:24.760 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:26:24 np0005532763 nova_compute[231311]: 2025-11-23 21:26:24.770 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:26:24 np0005532763 nova_compute[231311]: 2025-11-23 21:26:24.787 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:26:24 np0005532763 nova_compute[231311]: 2025-11-23 21:26:24.790 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:26:24 np0005532763 nova_compute[231311]: 2025-11-23 21:26:24.790 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:26:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:26:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:24.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:26:24 np0005532763 nova_compute[231311]: 2025-11-23 21:26:24.900 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:26:25 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:26:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:25.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:25 np0005532763 nova_compute[231311]: 2025-11-23 21:26:25.792 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 23 16:26:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:26 np0005532763 nova_compute[231311]: 2025-11-23 21:26:26.692 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:26:26 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:26:26 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:26:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:26.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:26:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:26:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:26:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:26:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:27.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:28.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:29.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:29 np0005532763 nova_compute[231311]: 2025-11-23 21:26:29.904 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:26:30 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:26:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:30.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:31 np0005532763 nova_compute[231311]: 2025-11-23 21:26:31.693 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:26:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:31.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:26:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:26:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:31 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:26:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:26:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:32 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:32 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:32 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:32.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:26:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:33.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:26:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:34 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:34 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:34 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:34.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:34 np0005532763 nova_compute[231311]: 2025-11-23 21:26:34.903 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:26:35 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:26:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:26:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:35.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:26:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:36 np0005532763 nova_compute[231311]: 2025-11-23 21:26:36.726 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:26:36 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:36 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:36 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:36.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:26:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:26:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:26:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:26:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000056s ======
Nov 23 16:26:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:37.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000056s
Nov 23 16:26:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:38 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:38 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:38 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:38.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:39.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:39 np0005532763 nova_compute[231311]: 2025-11-23 21:26:39.907 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:26:40 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:26:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:40 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:40 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:26:40 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:40.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:26:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:41 np0005532763 podman[255771]: 2025-11-23 21:26:41.213611558 +0000 UTC m=+0.088867820 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 23 16:26:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:41 np0005532763 nova_compute[231311]: 2025-11-23 21:26:41.770 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:26:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:41.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:26:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:26:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:26:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:26:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:42 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:42 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:42 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:42.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:26:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:43.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:26:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:44 np0005532763 nova_compute[231311]: 2025-11-23 21:26:44.909 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:26:44 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:44 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:44 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:44.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:45 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:26:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000057s ======
Nov 23 16:26:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:45.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Nov 23 16:26:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:46 np0005532763 nova_compute[231311]: 2025-11-23 21:26:46.807 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:26:46 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:46 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:46 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:46.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:26:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:26:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:46 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:26:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:26:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:47.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:48 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:48 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:48 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:48.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:26:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:49.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:26:49 np0005532763 nova_compute[231311]: 2025-11-23 21:26:49.910 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:26:50 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:26:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:50 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:50 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:26:50 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:50.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:26:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:51.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:51 np0005532763 nova_compute[231311]: 2025-11-23 21:26:51.840 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:26:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:26:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:26:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:26:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:26:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:26:52.243 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:26:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:26:52.243 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:26:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:26:52.243 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:26:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:52 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:52 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:52 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:52.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:53.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:54 np0005532763 podman[255828]: 2025-11-23 21:26:54.205695168 +0000 UTC m=+0.085789853 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd)
Nov 23 16:26:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:54 np0005532763 podman[255829]: 2025-11-23 21:26:54.264144795 +0000 UTC m=+0.145111355 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 23 16:26:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:54 np0005532763 nova_compute[231311]: 2025-11-23 21:26:54.913 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:26:54 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:54 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:54 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:54.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:55 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:26:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:26:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:55.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:26:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:56 np0005532763 nova_compute[231311]: 2025-11-23 21:26:56.842 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:26:56 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:56 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:26:56 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:56.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:26:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:26:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:26:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:26:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:26:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:26:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:57.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:58 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:58 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:26:58 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:26:58.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:26:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:26:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:26:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:26:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:26:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:26:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:26:59.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:26:59 np0005532763 nova_compute[231311]: 2025-11-23 21:26:59.915 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:00 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:27:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:00 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:00 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:00 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:00.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:01.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:01 np0005532763 nova_compute[231311]: 2025-11-23 21:27:01.845 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:27:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:27:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:27:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:27:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:02 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:02 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:02 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:02.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:27:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:03.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:27:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:04 np0005532763 nova_compute[231311]: 2025-11-23 21:27:04.916 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:04 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:04 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:04 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:04.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:05 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:27:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:05.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:06 np0005532763 nova_compute[231311]: 2025-11-23 21:27:06.848 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:06 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:06 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:27:06 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:06.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:27:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:27:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:27:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:27:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:27:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:07.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:27:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/801547798' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:27:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:27:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/801547798' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:27:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:08 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:08 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:27:08 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:08.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:27:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:27:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:09.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:27:09 np0005532763 nova_compute[231311]: 2025-11-23 21:27:09.919 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:10 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:27:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:10 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:10 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:27:10 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:10.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:27:11.112335) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763933231112379, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1600, "num_deletes": 255, "total_data_size": 4028045, "memory_usage": 4079408, "flush_reason": "Manual Compaction"}
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763933231128483, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 2630519, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39661, "largest_seqno": 41255, "table_properties": {"data_size": 2623758, "index_size": 3896, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14204, "raw_average_key_size": 19, "raw_value_size": 2610121, "raw_average_value_size": 3660, "num_data_blocks": 168, "num_entries": 713, "num_filter_entries": 713, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763933096, "oldest_key_time": 1763933096, "file_creation_time": 1763933231, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 16208 microseconds, and 9680 cpu microseconds.
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:27:11.128540) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 2630519 bytes OK
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:27:11.128565) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:27:11.130509) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:27:11.130531) EVENT_LOG_v1 {"time_micros": 1763933231130524, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:27:11.130553) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 4020730, prev total WAL file size 4020730, number of live WAL files 2.
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:27:11.132092) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303034' seq:72057594037927935, type:22 .. '6C6F676D0031323535' seq:0, type:0; will stop at (end)
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(2568KB)], [75(12MB)]
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763933231132136, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 15617547, "oldest_snapshot_seqno": -1}
Nov 23 16:27:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6893 keys, 15452571 bytes, temperature: kUnknown
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763933231221578, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 15452571, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15406789, "index_size": 27430, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17285, "raw_key_size": 181157, "raw_average_key_size": 26, "raw_value_size": 15282866, "raw_average_value_size": 2217, "num_data_blocks": 1083, "num_entries": 6893, "num_filter_entries": 6893, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1763930464, "oldest_key_time": 0, "file_creation_time": 1763933231, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "2623c212-36b1-4df9-b695-1a7be3fdfc0c", "db_session_id": "9X7YYXRZ70MLDLQBPDMX", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:27:11.222009) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 15452571 bytes
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:27:11.223569) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 174.3 rd, 172.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.5, 12.4 +0.0 blob) out(14.7 +0.0 blob), read-write-amplify(11.8) write-amplify(5.9) OK, records in: 7421, records dropped: 528 output_compression: NoCompression
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:27:11.223607) EVENT_LOG_v1 {"time_micros": 1763933231223591, "job": 46, "event": "compaction_finished", "compaction_time_micros": 89619, "compaction_time_cpu_micros": 52100, "output_level": 6, "num_output_files": 1, "total_output_size": 15452571, "num_input_records": 7421, "num_output_records": 6893, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763933231224770, "job": 46, "event": "table_file_deletion", "file_number": 77}
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: EVENT_LOG_v1 {"time_micros": 1763933231229342, "job": 46, "event": "table_file_deletion", "file_number": 75}
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:27:11.131967) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:27:11.229508) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:27:11.229516) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:27:11.229519) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:27:11.229522) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:27:11 np0005532763 ceph-mon[75752]: rocksdb: (Original Log Time 2025/11/23-21:27:11.229525) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Nov 23 16:27:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:11.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:11 np0005532763 nova_compute[231311]: 2025-11-23 21:27:11.851 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:27:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:27:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:27:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:27:12 np0005532763 podman[255916]: 2025-11-23 21:27:12.207622522 +0000 UTC m=+0.085255428 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 23 16:27:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:12 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:12 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:27:12 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:12.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:27:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:13.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:14 np0005532763 nova_compute[231311]: 2025-11-23 21:27:14.921 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:14 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:14 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:14 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:14.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:27:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:27:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:15.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:27:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:16 np0005532763 nova_compute[231311]: 2025-11-23 21:27:16.378 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:27:16 np0005532763 nova_compute[231311]: 2025-11-23 21:27:16.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:27:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:16 np0005532763 nova_compute[231311]: 2025-11-23 21:27:16.888 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:16 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:16 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:16 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:16.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:27:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:27:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:27:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:27:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:17 np0005532763 nova_compute[231311]: 2025-11-23 21:27:17.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:27:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:17.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:18 np0005532763 nova_compute[231311]: 2025-11-23 21:27:18.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:27:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:18 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:18 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:27:18 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:18.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:27:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:19 np0005532763 nova_compute[231311]: 2025-11-23 21:27:19.397 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:27:19 np0005532763 nova_compute[231311]: 2025-11-23 21:27:19.398 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:27:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:19.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:19 np0005532763 nova_compute[231311]: 2025-11-23 21:27:19.923 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:20 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:27:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:20 np0005532763 nova_compute[231311]: 2025-11-23 21:27:20.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:27:20 np0005532763 nova_compute[231311]: 2025-11-23 21:27:20.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:27:20 np0005532763 nova_compute[231311]: 2025-11-23 21:27:20.384 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:27:20 np0005532763 nova_compute[231311]: 2025-11-23 21:27:20.409 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:27:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:20 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:20 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:20 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:20.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:27:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:21.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:27:21 np0005532763 nova_compute[231311]: 2025-11-23 21:27:21.923 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:27:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:27:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:21 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:27:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:22 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:27:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:22 np0005532763 nova_compute[231311]: 2025-11-23 21:27:22.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:27:22 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:22 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:22 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:22 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:22 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:22.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:23 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:23 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:23 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:23 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:23 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:23.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:24 np0005532763 nova_compute[231311]: 2025-11-23 21:27:24.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:27:24 np0005532763 nova_compute[231311]: 2025-11-23 21:27:24.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:27:24 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:24 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:24 np0005532763 nova_compute[231311]: 2025-11-23 21:27:24.925 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:24 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:24 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:27:24 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:24.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:27:25 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:27:25 np0005532763 podman[255948]: 2025-11-23 21:27:25.18492482 +0000 UTC m=+0.072310201 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 16:27:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:25 np0005532763 podman[255949]: 2025-11-23 21:27:25.253258477 +0000 UTC m=+0.125521009 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 23 16:27:25 np0005532763 nova_compute[231311]: 2025-11-23 21:27:25.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:27:25 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:25 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:25 np0005532763 nova_compute[231311]: 2025-11-23 21:27:25.419 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:27:25 np0005532763 nova_compute[231311]: 2025-11-23 21:27:25.420 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:27:25 np0005532763 nova_compute[231311]: 2025-11-23 21:27:25.421 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:27:25 np0005532763 nova_compute[231311]: 2025-11-23 21:27:25.421 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 23 16:27:25 np0005532763 nova_compute[231311]: 2025-11-23 21:27:25.422 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:27:25 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:25 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:25 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:25.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:25 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:27:25 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1904251977' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:27:25 np0005532763 nova_compute[231311]: 2025-11-23 21:27:25.891 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:27:26 np0005532763 nova_compute[231311]: 2025-11-23 21:27:26.133 231315 WARNING nova.virt.libvirt.driver [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 23 16:27:26 np0005532763 nova_compute[231311]: 2025-11-23 21:27:26.135 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4810MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 23 16:27:26 np0005532763 nova_compute[231311]: 2025-11-23 21:27:26.135 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:27:26 np0005532763 nova_compute[231311]: 2025-11-23 21:27:26.136 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:27:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:26 np0005532763 nova_compute[231311]: 2025-11-23 21:27:26.317 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 23 16:27:26 np0005532763 nova_compute[231311]: 2025-11-23 21:27:26.318 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 23 16:27:26 np0005532763 nova_compute[231311]: 2025-11-23 21:27:26.400 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 23 16:27:26 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:26 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:26 np0005532763 nova_compute[231311]: 2025-11-23 21:27:26.978 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:26 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:26 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:27:26 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:26.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:27:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:27:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:27:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:26 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:27:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:27 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:27:27 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Nov 23 16:27:27 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/882900723' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Nov 23 16:27:27 np0005532763 nova_compute[231311]: 2025-11-23 21:27:27.028 231315 DEBUG oslo_concurrency.processutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.628s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 23 16:27:27 np0005532763 nova_compute[231311]: 2025-11-23 21:27:27.033 231315 DEBUG nova.compute.provider_tree [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed in ProviderTree for provider: 20c32e0a-de2c-427c-9273-fac11e2660f4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 23 16:27:27 np0005532763 nova_compute[231311]: 2025-11-23 21:27:27.053 231315 DEBUG nova.scheduler.client.report [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Inventory has not changed for provider 20c32e0a-de2c-427c-9273-fac11e2660f4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 23 16:27:27 np0005532763 nova_compute[231311]: 2025-11-23 21:27:27.054 231315 DEBUG nova.compute.resource_tracker [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 23 16:27:27 np0005532763 nova_compute[231311]: 2025-11-23 21:27:27.055 231315 DEBUG oslo_concurrency.lockutils [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.919s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:27:27 np0005532763 nova_compute[231311]: 2025-11-23 21:27:27.055 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:27:27 np0005532763 nova_compute[231311]: 2025-11-23 21:27:27.056 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 23 16:27:27 np0005532763 nova_compute[231311]: 2025-11-23 21:27:27.078 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 23 16:27:27 np0005532763 nova_compute[231311]: 2025-11-23 21:27:27.078 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:27:27 np0005532763 nova_compute[231311]: 2025-11-23 21:27:27.079 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 23 16:27:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:27 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:27 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:27 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:27 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:27 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:27.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:28 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:28 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:28 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:28 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:28 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:28.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:29 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:29 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:29 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:29 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:27:29 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:29.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:27:29 np0005532763 nova_compute[231311]: 2025-11-23 21:27:29.928 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:30 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:27:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:30 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:30 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:30 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:27:30 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:27:30 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:30 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:30 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:30.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:31 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:31 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:31 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Nov 23 16:27:31 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:27:31 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:27:31 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Nov 23 16:27:31 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:31 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.002000057s ======
Nov 23 16:27:31 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:31.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000057s
Nov 23 16:27:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:27:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:27:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:27:32 np0005532763 nova_compute[231311]: 2025-11-23 21:27:32.014 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:32 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:27:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:32 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:32 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:33.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:33 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:33 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:33 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:33 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:33 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:33.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:34 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:34 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:34 np0005532763 nova_compute[231311]: 2025-11-23 21:27:34.930 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:35.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:35 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:27:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:35 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:35 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:35 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:35 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:27:35 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:35.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:27:35 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:27:35 np0005532763 ceph-mon[75752]: from='mgr.14688 192.168.122.100:0/520882446' entity='mgr.compute-0.oyehye' 
Nov 23 16:27:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:36 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:36 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:27:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:27:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:36 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:27:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:37 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:27:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:37.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:37 np0005532763 nova_compute[231311]: 2025-11-23 21:27:37.057 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:37 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:37 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:37 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:37 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:37 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:37.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:38 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:38 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:39.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:39 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:39 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:39 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:39 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:39 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:39.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:39 np0005532763 nova_compute[231311]: 2025-11-23 21:27:39.954 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:40 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:27:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:40 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:40 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:27:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:41.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:27:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:41 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:41 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:41 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:41 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:27:41 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:41.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:27:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:27:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:27:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:41 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:27:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:42 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:27:42 np0005532763 nova_compute[231311]: 2025-11-23 21:27:42.061 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:42 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:42 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:43.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:43 np0005532763 podman[256189]: 2025-11-23 21:27:43.218457891 +0000 UTC m=+0.092399891 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack 
Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 23 16:27:43 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:43 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:43 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:43 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:27:43 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:43.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:27:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:44 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:44 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:44 np0005532763 nova_compute[231311]: 2025-11-23 21:27:44.956 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:45.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:45 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:27:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:45 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:45 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:45 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:45 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:45 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:45.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:46 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:46 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:27:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:27:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:27:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:47 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:27:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:27:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:47.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:27:47 np0005532763 nova_compute[231311]: 2025-11-23 21:27:47.106 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:47 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:47 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:47 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:47 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:27:47 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:47.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:27:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:48 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:48 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:27:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:49.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:27:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:49 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:49 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:49 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:49 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:27:49 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:49.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:27:49 np0005532763 nova_compute[231311]: 2025-11-23 21:27:49.957 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:50 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:27:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:50 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:50 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:27:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:51.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:27:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:51 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:51 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:51 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:51 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:51 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:51.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:27:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:51 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:27:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:27:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:52 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:27:52 np0005532763 nova_compute[231311]: 2025-11-23 21:27:52.146 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:27:52.243 142920 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 23 16:27:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:27:52.244 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 23 16:27:52 np0005532763 ovn_metadata_agent[142915]: 2025-11-23 21:27:52.244 142920 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 23 16:27:52 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:52 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:53.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:53 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:53 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:53 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:53 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:53 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:53.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:54 np0005532763 systemd-logind[830]: New session 58 of user zuul.
Nov 23 16:27:54 np0005532763 systemd[1]: Started Session 58 of User zuul.
Nov 23 16:27:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:54 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:54 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:54 np0005532763 nova_compute[231311]: 2025-11-23 21:27:54.959 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:55.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:55 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:27:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:55 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:55 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:55 np0005532763 podman[256285]: 2025-11-23 21:27:55.529498793 +0000 UTC m=+0.106612803 container health_status 43a33544bf04dca00c0ccb6961a292d460a658c0566e9a0ef748af99c21e0152 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 23 16:27:55 np0005532763 podman[256286]: 2025-11-23 21:27:55.570169016 +0000 UTC m=+0.143935921 container health_status 75fb00be3d865cda0c41df82265637be878a02830152cced7e2593751faaa745 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 23 16:27:55 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:55 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:27:55 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:55.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:27:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:56 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:56 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:27:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:27:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:56 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:27:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:27:57 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:27:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:27:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:57.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:27:57 np0005532763 nova_compute[231311]: 2025-11-23 21:27:57.182 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:27:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:57 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:57 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:57 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:57 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:27:57 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:57.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:27:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:58 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Nov 23 16:27:58 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2562450027' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Nov 23 16:27:58 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:58 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:27:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:27:59.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:27:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:27:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:59 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:27:59 2025: (VI_0) received an invalid passwd!
Nov 23 16:27:59 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:27:59 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:27:59 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:27:59.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:27:59 np0005532763 nova_compute[231311]: 2025-11-23 21:27:59.961 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:28:00 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:28:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:00 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:00 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:28:01.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:01 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:01 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:01 np0005532763 ovs-vsctl[256635]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 23 16:28:01 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:01 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:28:01 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:28:01.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:28:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:28:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:28:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:28:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:28:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:28:01 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:28:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:28:02 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:28:02 np0005532763 nova_compute[231311]: 2025-11-23 21:28:02.215 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:28:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:02 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:02 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:02 np0005532763 virtqemud[230850]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 23 16:28:02 np0005532763 virtqemud[230850]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 23 16:28:02 np0005532763 virtqemud[230850]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 23 16:28:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:28:03.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:03 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:03 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:03 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: cache status {prefix=cache status} (starting...)
Nov 23 16:28:03 np0005532763 lvm[256969]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Nov 23 16:28:03 np0005532763 lvm[256969]: VG ceph_vg0 finished
Nov 23 16:28:03 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: client ls {prefix=client ls} (starting...)
Nov 23 16:28:03 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:03 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:03 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:28:03.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:04 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: damage ls {prefix=damage ls} (starting...)
Nov 23 16:28:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Nov 23 16:28:04 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3541442377' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Nov 23 16:28:04 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: dump loads {prefix=dump loads} (starting...)
Nov 23 16:28:04 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:04 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:04 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Nov 23 16:28:04 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Nov 23 16:28:04 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Nov 23 16:28:04 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3636553393' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Nov 23 16:28:04 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Nov 23 16:28:04 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Nov 23 16:28:04 np0005532763 nova_compute[231311]: 2025-11-23 21:28:04.962 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:28:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:28:05.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:05 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Nov 23 16:28:05 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:28:05 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Nov 23 16:28:05 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1787544456' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Nov 23 16:28:05 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: get subtrees {prefix=get subtrees} (starting...)
Nov 23 16:28:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:05 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:05 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:05 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: ops {prefix=ops} (starting...)
Nov 23 16:28:05 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Nov 23 16:28:05 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4190709229' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Nov 23 16:28:05 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:05 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:05 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:28:05.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:06 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: session ls {prefix=session ls} (starting...)
Nov 23 16:28:06 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Nov 23 16:28:06 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/558271612' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 23 16:28:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:06 np0005532763 ceph-mds[84968]: mds.cephfs.compute-2.utubtn asok_command: status {prefix=status} (starting...)
Nov 23 16:28:06 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:06 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:06 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Nov 23 16:28:06 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3684320480' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 23 16:28:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:28:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:28:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:28:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:28:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:28:06 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:28:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:28:07 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:28:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Nov 23 16:28:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/212132359' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Nov 23 16:28:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:28:07.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Nov 23 16:28:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1731456197' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 23 16:28:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:07 np0005532763 nova_compute[231311]: 2025-11-23 21:28:07.223 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:28:07 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:07 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Nov 23 16:28:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/425593023' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Nov 23 16:28:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 23 16:28:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3816304753' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 23 16:28:07 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:07 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:07 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:28:07.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Nov 23 16:28:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/581907728' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Nov 23 16:28:07 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Nov 23 16:28:07 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/581907728' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Nov 23 16:28:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:08 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:08 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:08 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Nov 23 16:28:08 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3896004946' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Nov 23 16:28:08 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Nov 23 16:28:08 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/902183333' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Nov 23 16:28:08 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Nov 23 16:28:08 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/351794627' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 23 16:28:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000029s ======
Nov 23 16:28:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:28:09.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Nov 23 16:28:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Nov 23 16:28:09 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2618177895' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Nov 23 16:28:09 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:09 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:09 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Nov 23 16:28:09 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3599506863' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d799ab800 session 0x557d7c149c20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879822 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 71.702980042s of 71.722465515s, submitted: 3
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881334 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880743 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880743 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7c4afc00 session 0x557d7c4430e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880743 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880743 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73449472 unmapped: 958464 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880743 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73457664 unmapped: 950272 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880743 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880743 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 36.037124634s of 36.047756195s, submitted: 2
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d799ab800 session 0x557d7c9ea000
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880152 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73465856 unmapped: 942080 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880152 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880152 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.927753448s of 16.932226181s, submitted: 1
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73474048 unmapped: 933888 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73482240 unmapped: 925696 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73490432 unmapped: 917504 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d799aa800 session 0x557d7c116f00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73498624 unmapped: 909312 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881664 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 55.639171600s of 55.643314362s, submitted: 1
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883176 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883176 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882585 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882585 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882585 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73506816 unmapped: 901120 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 892928 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 892928 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 892928 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d79d0fc00 session 0x557d7c9eb2c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 892928 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882585 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 892928 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 892928 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73515008 unmapped: 892928 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882585 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882585 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882585 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73523200 unmapped: 884736 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 876544 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 876544 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 876544 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73531392 unmapped: 876544 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882585 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 49.515476227s of 49.525783539s, submitted: 2
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881994 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881994 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881994 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881994 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73539584 unmapped: 868352 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881994 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.195587158s of 25.200448990s, submitted: 1
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 883506 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73555968 unmapped: 851968 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7a59d800 session 0x557d7c9f25a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 885018 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 885018 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73564160 unmapped: 843776 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 885018 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 827392 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 827392 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.715751648s of 19.726030350s, submitted: 2
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 827392 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73580544 unmapped: 827392 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73596928 unmapped: 811008 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888042 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887451 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73613312 unmapped: 794624 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 786432 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 786432 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 786432 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 786432 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 786432 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73621504 unmapped: 786432 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73629696 unmapped: 778240 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73629696 unmapped: 778240 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73637888 unmapped: 770048 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:09 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:09 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:28:09.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73654272 unmapped: 753664 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73670656 unmapped: 737280 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7c620400 session 0x557d7ce4c000
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread fragmentation_score=0.000025 took=0.000041s
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73678848 unmapped: 729088 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 720896 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 720896 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 720896 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 720896 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73687040 unmapped: 720896 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886860 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 704512 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 704512 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 704512 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73703424 unmapped: 704512 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 62.199523926s of 62.215599060s, submitted: 4
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73719808 unmapped: 688128 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888372 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 679936 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 679936 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 679936 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 679936 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73728000 unmapped: 679936 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887781 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 671744 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73736192 unmapped: 671744 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73744384 unmapped: 663552 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887190 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73752576 unmapped: 655360 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887190 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73768960 unmapped: 638976 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73768960 unmapped: 638976 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887190 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887190 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d79d0f000 session 0x557d7c9efa40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887190 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73777152 unmapped: 630784 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887190 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.1 total, 600.0 interval
Cumulative writes: 6101 writes, 25K keys, 6101 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 6101 writes, 1158 syncs, 5.27 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 516 writes, 817 keys, 516 commit groups, 1.0 writes per commit group, ingest: 0.27 MB, 0.00 MB/s
Interval WAL: 516 writes, 256 syncs, 2.02 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x557d78bc9350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x557d78bc9350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.1 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 581632 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 581632 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 581632 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73826304 unmapped: 581632 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 573440 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887190 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 573440 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 573440 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 573440 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 573440 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 573440 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887190 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73834496 unmapped: 573440 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 47.645763397s of 47.658298492s, submitted: 3
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888702 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73859072 unmapped: 548864 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888111 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888111 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d799aa800 session 0x557d7ce4da40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888111 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888111 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 73867264 unmapped: 540672 heap: 74407936 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.442523956s of 28.453479767s, submitted: 2
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889623 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889623 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889032 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889032 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889032 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d799ab800 session 0x557d7ce80b40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74932224 unmapped: 524288 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889032 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.711717606s of 25.720556259s, submitted: 2
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 74973184 unmapped: 483328 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76251136 unmapped: 253952 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76300288 unmapped: 204800 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 172032 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 172032 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889032 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 172032 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 172032 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 172032 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 172032 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 172032 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889032 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76333056 unmapped: 172032 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.249409676s of 10.981097221s, submitted: 213
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76341248 unmapped: 163840 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 892056 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 892056 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d79d0fc00 session 0x557d7ce67680
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 891465 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76357632 unmapped: 147456 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 891465 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76357632 unmapped: 147456 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76357632 unmapped: 147456 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76357632 unmapped: 147456 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76357632 unmapped: 147456 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.958477020s of 22.970237732s, submitted: 3
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76357632 unmapped: 147456 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 892977 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76357632 unmapped: 147456 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76357632 unmapped: 147456 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894489 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 893307 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 893307 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 893307 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 893307 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76390400 unmapped: 114688 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 893307 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7a70c800 session 0x557d7c9f25a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 893307 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7a59d800 session 0x557d7c117e00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 893307 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 893307 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 46.180438995s of 46.197158813s, submitted: 4
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76406784 unmapped: 98304 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76423168 unmapped: 81920 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894819 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76423168 unmapped: 81920 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76423168 unmapped: 81920 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76423168 unmapped: 81920 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895740 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895740 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d7c4ae400 session 0x557d7c536f00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895740 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895740 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895740 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895740 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 38.757907867s of 38.770950317s, submitted: 3
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76431360 unmapped: 73728 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895149 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76439552 unmapped: 65536 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 43.226833344s of 43.231521606s, submitted: 1
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898173 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 ms_handle_reset con 0x557d799aa800 session 0x557d7cf09e00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898173 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898173 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xfa97e/0x1a8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 142 handle_osd_map epochs [142,143], i have 142, src has [1,143]
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.084466934s of 18.093536377s, submitted: 2
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903659 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76464128 unmapped: 40960 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76701696 unmapped: 17637376 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 145 ms_handle_reset con 0x557d799ab800 session 0x557d7ce67680
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76701696 unmapped: 17637376 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 145 heartbeat osd_stat(store_statfs(0x4fbdf7000/0x0/0x4ffc00000, data 0xd70ce8/0xe23000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76701696 unmapped: 17637376 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 146 ms_handle_reset con 0x557d79d0fc00 session 0x557d7ce4de00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 999818 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 146 heartbeat osd_stat(store_statfs(0x4fbdf4000/0x0/0x4ffc00000, data 0xd72e13/0xe27000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 146 handle_osd_map epochs [146,147], i have 146, src has [1,147]
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.925214767s of 10.168018341s, submitted: 38
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1001985 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf4000/0x0/0x4ffc00000, data 0xd72e13/0xe27000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1001394 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1001394 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 147 ms_handle_reset con 0x557d799aa800 session 0x557d7c9efe00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76578816 unmapped: 17760256 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1001394 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1001394 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1001394 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76587008 unmapped: 17752064 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.053102493s of 27.068101883s, submitted: 12
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1002906 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf1000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1001475 data_alloc: 218103808 data_used: 65536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf2000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 76595200 unmapped: 17743872 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 147 ms_handle_reset con 0x557d7c4ae400 session 0x557d7c336960
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 84492288 unmapped: 9846784 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 147 ms_handle_reset con 0x557d7c942000 session 0x557d7c336b40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf2000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 88104960 unmapped: 6234112 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1031723 data_alloc: 234881024 data_used: 11534336
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fbdf2000/0x0/0x4ffc00000, data 0xd74de5/0xe2a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 88104960 unmapped: 6234112 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 147 handle_osd_map epochs [147,148], i have 147, src has [1,148]
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.397347450s of 14.407382011s, submitted: 2
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 88104960 unmapped: 6234112 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fbdee000/0x0/0x4ffc00000, data 0xd76ed1/0xe2d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 88104960 unmapped: 6234112 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 88104960 unmapped: 6234112 heap: 94339072 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 149 ms_handle_reset con 0x557d7d332800 session 0x557d7c336f00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 91455488 unmapped: 12402688 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1090641 data_alloc: 234881024 data_used: 11534336
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 149 ms_handle_reset con 0x557d7d332c00 session 0x557d7c3374a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 91455488 unmapped: 12402688 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 91455488 unmapped: 12402688 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 149 ms_handle_reset con 0x557d799aa800 session 0x557d7c337680
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 149 heartbeat osd_stat(store_statfs(0x4fb7a5000/0x0/0x4ffc00000, data 0x13bf011/0x1476000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 91447296 unmapped: 12410880 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 149 ms_handle_reset con 0x557d7c4ae400 session 0x557d7c337860
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 149 ms_handle_reset con 0x557d7c942000 session 0x557d7c337c20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 91668480 unmapped: 12189696 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 91684864 unmapped: 12173312 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1096615 data_alloc: 234881024 data_used: 11681792
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97640448 unmapped: 6217728 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97665024 unmapped: 6193152 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 149 heartbeat osd_stat(store_statfs(0x4fb781000/0x0/0x4ffc00000, data 0x13e3034/0x149b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.443373680s of 13.634446144s, submitted: 43
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1141981 data_alloc: 234881024 data_used: 17829888
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fb77d000/0x0/0x4ffc00000, data 0x13e5006/0x149e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fb77d000/0x0/0x4ffc00000, data 0x13e5006/0x149e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1141981 data_alloc: 234881024 data_used: 17829888
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 97697792 unmapped: 6160384 heap: 103858176 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 103718912 unmapped: 4366336 heap: 108085248 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fab2f000/0x0/0x4ffc00000, data 0x2026006/0x20df000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 103350272 unmapped: 4734976 heap: 108085248 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 103989248 unmapped: 5144576 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 103989248 unmapped: 5144576 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245847 data_alloc: 234881024 data_used: 18145280
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f990a000/0x0/0x4ffc00000, data 0x20b1006/0x216a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104022016 unmapped: 5111808 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f990a000/0x0/0x4ffc00000, data 0x20b1006/0x216a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104062976 unmapped: 5070848 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104062976 unmapped: 5070848 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.709860802s of 14.013646126s, submitted: 123
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104079360 unmapped: 5054464 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f990a000/0x0/0x4ffc00000, data 0x20b1006/0x216a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104095744 unmapped: 5038080 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1241983 data_alloc: 234881024 data_used: 18153472
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104095744 unmapped: 5038080 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104095744 unmapped: 5038080 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104095744 unmapped: 5038080 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104095744 unmapped: 5038080 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f98f1000/0x0/0x4ffc00000, data 0x20d2006/0x218b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104095744 unmapped: 5038080 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1241983 data_alloc: 234881024 data_used: 18153472
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104251392 unmapped: 4882432 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f98e8000/0x0/0x4ffc00000, data 0x20db006/0x2194000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1242087 data_alloc: 234881024 data_used: 18153472
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f98e8000/0x0/0x4ffc00000, data 0x20db006/0x2194000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f98e8000/0x0/0x4ffc00000, data 0x20db006/0x2194000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7d333400 session 0x557d7c08c780
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c146000 session 0x557d7c555a40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c146000 session 0x557d7a087c20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104259584 unmapped: 4874240 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7a087a40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1242087 data_alloc: 234881024 data_used: 18153472
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c4ae400 session 0x557d7a086780
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105717760 unmapped: 3416064 heap: 109133824 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f98e8000/0x0/0x4ffc00000, data 0x20db006/0x2194000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c942000 session 0x557d7c0c5860
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.610626221s of 17.632879257s, submitted: 5
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7d333400 session 0x557d7c0c4960
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7d333400 session 0x557d7c443c20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c442f00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c146000 session 0x557d7c442d20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c4ae400 session 0x557d7c4434a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 106233856 unmapped: 10911744 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 106233856 unmapped: 10911744 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 106233856 unmapped: 10911744 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f91f0000/0x0/0x4ffc00000, data 0x27d2016/0x288c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 106233856 unmapped: 10911744 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298305 data_alloc: 234881024 data_used: 18677760
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 106233856 unmapped: 10911744 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f91f0000/0x0/0x4ffc00000, data 0x27d2016/0x288c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c942000 session 0x557d7c4432c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 11558912 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105644032 unmapped: 11501568 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f91f0000/0x0/0x4ffc00000, data 0x27d2016/0x288c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112066560 unmapped: 5079040 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f91f0000/0x0/0x4ffc00000, data 0x27d2016/0x288c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112107520 unmapped: 5038080 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347766 data_alloc: 234881024 data_used: 25698304
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f91f0000/0x0/0x4ffc00000, data 0x27d2016/0x288c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112107520 unmapped: 5038080 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112107520 unmapped: 5038080 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112107520 unmapped: 5038080 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112107520 unmapped: 5038080 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.472221375s of 13.600175858s, submitted: 24
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112148480 unmapped: 4997120 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347854 data_alloc: 234881024 data_used: 25698304
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f91f0000/0x0/0x4ffc00000, data 0x27d2016/0x288c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112148480 unmapped: 4997120 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f91ed000/0x0/0x4ffc00000, data 0x27d5016/0x288f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112181248 unmapped: 4964352 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112189440 unmapped: 4956160 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113737728 unmapped: 3407872 heap: 117145600 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115908608 unmapped: 4390912 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1401956 data_alloc: 234881024 data_used: 26808320
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 114925568 unmapped: 5373952 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8a4e000/0x0/0x4ffc00000, data 0x2f74016/0x302e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 114925568 unmapped: 5373952 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 114925568 unmapped: 5373952 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 114925568 unmapped: 5373952 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8a4a000/0x0/0x4ffc00000, data 0x2f78016/0x3032000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 114892800 unmapped: 5406720 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1413436 data_alloc: 234881024 data_used: 26882048
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.477304459s of 10.752337456s, submitted: 95
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 114892800 unmapped: 5406720 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 5152768 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 5152768 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c942000 session 0x557d7b4aa3c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7a04bc20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8a28000/0x0/0x4ffc00000, data 0x2f9a016/0x3054000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110256128 unmapped: 10043392 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7cf02f00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 10027008 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1252882 data_alloc: 234881024 data_used: 18665472
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110272512 unmapped: 10027008 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f98df000/0x0/0x4ffc00000, data 0x20e4006/0x219d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110288896 unmapped: 10010624 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110288896 unmapped: 10010624 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f98da000/0x0/0x4ffc00000, data 0x20e9006/0x21a2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110288896 unmapped: 10010624 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7d333000 session 0x557d7c9ee3c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7d332800 session 0x557d7c336960
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110313472 unmapped: 9986048 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1073048 data_alloc: 234881024 data_used: 12181504
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c9ee000
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105013248 unmapped: 15286272 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70c800 session 0x557d7c149a40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105013248 unmapped: 15286272 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105013248 unmapped: 15286272 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fac48000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fac48000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065848 data_alloc: 234881024 data_used: 12066816
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a59d800 session 0x557d79a99c20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fac48000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065848 data_alloc: 234881024 data_used: 12066816
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fac48000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105021440 unmapped: 15278080 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065848 data_alloc: 234881024 data_used: 12066816
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 24.807607651s of 25.014438629s, submitted: 70
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 15622144 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 15622144 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 15622144 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 15622144 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c942000 session 0x557d7ae2b680
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7d333000 session 0x557d7cea5a40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c08de00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 15622144 heap: 120299520 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a59d800 session 0x557d7c555a40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1096410 data_alloc: 234881024 data_used: 12066816
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70c800 session 0x557d7c055680
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa473000/0x0/0x4ffc00000, data 0x1550fe3/0x1609000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104955904 unmapped: 19677184 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104955904 unmapped: 19677184 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa473000/0x0/0x4ffc00000, data 0x1550fe3/0x1609000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104955904 unmapped: 19677184 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa473000/0x0/0x4ffc00000, data 0x1550fe3/0x1609000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c942000 session 0x557d7b2abc20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104955904 unmapped: 19677184 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef5c00 session 0x557d7c116000
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104955904 unmapped: 19677184 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1140250 data_alloc: 234881024 data_used: 12066816
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7d020d20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a59d800 session 0x557d7c1163c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 104988672 unmapped: 19644416 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 105193472 unmapped: 19439616 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa472000/0x0/0x4ffc00000, data 0x1551006/0x160a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa472000/0x0/0x4ffc00000, data 0x1551006/0x160a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194519 data_alloc: 234881024 data_used: 19894272
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa472000/0x0/0x4ffc00000, data 0x1551006/0x160a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1194519 data_alloc: 234881024 data_used: 19894272
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa472000/0x0/0x4ffc00000, data 0x1551006/0x160a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x417f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 16801792 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.159788132s of 22.263095856s, submitted: 29
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115286016 unmapped: 9347072 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115941376 unmapped: 8691712 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113426432 unmapped: 11206656 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1333801 data_alloc: 234881024 data_used: 20561920
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113426432 unmapped: 11206656 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113426432 unmapped: 11206656 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8e34000/0x0/0x4ffc00000, data 0x2777006/0x2830000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113426432 unmapped: 11206656 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113426432 unmapped: 11206656 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113016832 unmapped: 11616256 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1328289 data_alloc: 234881024 data_used: 20561920
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113016832 unmapped: 11616256 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8e1d000/0x0/0x4ffc00000, data 0x2796006/0x284f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113016832 unmapped: 11616256 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8e1d000/0x0/0x4ffc00000, data 0x2796006/0x284f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113049600 unmapped: 11583488 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113049600 unmapped: 11583488 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113049600 unmapped: 11583488 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1328289 data_alloc: 234881024 data_used: 20561920
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.521698952s of 12.875432968s, submitted: 169
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 11476992 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8e10000/0x0/0x4ffc00000, data 0x27a3006/0x285c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113188864 unmapped: 11444224 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113197056 unmapped: 11436032 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8e10000/0x0/0x4ffc00000, data 0x27a3006/0x285c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 11427840 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 11427840 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1328425 data_alloc: 234881024 data_used: 20561920
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8e10000/0x0/0x4ffc00000, data 0x27a3006/0x285c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 11427840 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 11427840 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 11427840 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 11427840 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 11427840 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1329345 data_alloc: 234881024 data_used: 20578304
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8e0d000/0x0/0x4ffc00000, data 0x27a6006/0x285f000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113205248 unmapped: 11427840 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 11419648 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 113213440 unmapped: 11419648 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70c800 session 0x557d7ce7f680
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.209961891s of 13.227853775s, submitted: 4
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108429312 unmapped: 16203776 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef5800 session 0x557d7a1aa3c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1081933 data_alloc: 234881024 data_used: 12066816
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1081933 data_alloc: 234881024 data_used: 12066816
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1081933 data_alloc: 234881024 data_used: 12066816
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1081933 data_alloc: 234881024 data_used: 12066816
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108437504 unmapped: 16195584 heap: 124633088 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.695468903s of 20.757802963s, submitted: 25
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef4400 session 0x557d7c149c20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 24166400 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170017 data_alloc: 234881024 data_used: 12066816
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 24166400 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9cc0000/0x0/0x4ffc00000, data 0x18f3fe3/0x19ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 24166400 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 24166400 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 24166400 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9cc0000/0x0/0x4ffc00000, data 0x18f3fe3/0x19ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 24166400 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9cc0000/0x0/0x4ffc00000, data 0x18f3fe3/0x19ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1170017 data_alloc: 234881024 data_used: 12066816
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108347392 unmapped: 24166400 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef4400 session 0x557d7c9ebe00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7d002000
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9cc0000/0x0/0x4ffc00000, data 0x18f3fe3/0x19ac000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a59d800 session 0x557d7b37a1e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108109824 unmapped: 24403968 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108109824 unmapped: 24403968 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108109824 unmapped: 24403968 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa839000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108109824 unmapped: 24403968 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1089434 data_alloc: 234881024 data_used: 12066816
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.831846237s of 10.980201721s, submitted: 47
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70cc00 session 0x557d7c7b5680
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70d000 session 0x557d7be034a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70c800 session 0x557d7b37b2c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70cc00 session 0x557d7b2aad20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70d000 session 0x557d7d0205a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108445696 unmapped: 24068096 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108445696 unmapped: 24068096 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa686000/0x0/0x4ffc00000, data 0xf2d045/0xfe6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108445696 unmapped: 24068096 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108445696 unmapped: 24068096 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108445696 unmapped: 24068096 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7d021680
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1111663 data_alloc: 234881024 data_used: 12066816
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa686000/0x0/0x4ffc00000, data 0xf2d045/0xfe6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108470272 unmapped: 24043520 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108150784 unmapped: 24363008 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa686000/0x0/0x4ffc00000, data 0xf2d045/0xfe6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108150784 unmapped: 24363008 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108150784 unmapped: 24363008 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108150784 unmapped: 24363008 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1116031 data_alloc: 234881024 data_used: 12627968
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108150784 unmapped: 24363008 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa686000/0x0/0x4ffc00000, data 0xf2d045/0xfe6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108150784 unmapped: 24363008 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 108150784 unmapped: 24363008 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa686000/0x0/0x4ffc00000, data 0xf2d045/0xfe6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 106979328 unmapped: 25534464 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa686000/0x0/0x4ffc00000, data 0xf2d045/0xfe6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 106979328 unmapped: 25534464 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1116031 data_alloc: 234881024 data_used: 12627968
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 106979328 unmapped: 25534464 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.528369904s of 16.666515350s, submitted: 46
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 109846528 unmapped: 22667264 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110518272 unmapped: 21995520 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110682112 unmapped: 21831680 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9e16000/0x0/0x4ffc00000, data 0x179d045/0x1856000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110010368 unmapped: 22503424 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182843 data_alloc: 234881024 data_used: 12701696
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110010368 unmapped: 22503424 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110010368 unmapped: 22503424 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9e16000/0x0/0x4ffc00000, data 0x179d045/0x1856000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182859 data_alloc: 234881024 data_used: 12701696
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9e16000/0x0/0x4ffc00000, data 0x179d045/0x1856000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182859 data_alloc: 234881024 data_used: 12701696
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9e16000/0x0/0x4ffc00000, data 0x179d045/0x1856000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a0c2000 session 0x557d7ce7fe00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7ce7eb40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a0c2000 session 0x557d7c9eb4a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 22495232 heap: 132513792 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70c800 session 0x557d7a0863c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.219844818s of 16.448907852s, submitted: 71
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70cc00 session 0x557d7c7b52c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70d000 session 0x557d7c443e00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c9ee1e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a0c2000 session 0x557d7ce7e000
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70c800 session 0x557d7c7b50e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111042560 unmapped: 27295744 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111042560 unmapped: 27295744 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265830 data_alloc: 234881024 data_used: 12701696
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111042560 unmapped: 27295744 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111042560 unmapped: 27295744 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111042560 unmapped: 27295744 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9420000/0x0/0x4ffc00000, data 0x2192055/0x224c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9420000/0x0/0x4ffc00000, data 0x2192055/0x224c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111050752 unmapped: 27287552 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111050752 unmapped: 27287552 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1265830 data_alloc: 234881024 data_used: 12701696
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111050752 unmapped: 27287552 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70cc00 session 0x557d7c116b40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9420000/0x0/0x4ffc00000, data 0x2192055/0x224c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 28319744 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 110043136 unmapped: 28295168 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1339792 data_alloc: 234881024 data_used: 21274624
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f93fb000/0x0/0x4ffc00000, data 0x21b6078/0x2271000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1339792 data_alloc: 234881024 data_used: 21274624
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f93fb000/0x0/0x4ffc00000, data 0x21b6078/0x2271000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 115728384 unmapped: 22609920 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.856174469s of 20.003021240s, submitted: 38
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118865920 unmapped: 19472384 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 18563072 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1466068 data_alloc: 234881024 data_used: 22339584
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 18563072 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8673000/0x0/0x4ffc00000, data 0x2f3e078/0x2ff9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 18563072 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 18563072 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119775232 unmapped: 18563072 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8673000/0x0/0x4ffc00000, data 0x2f3e078/0x2ff9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119808000 unmapped: 18530304 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1467556 data_alloc: 234881024 data_used: 22343680
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119939072 unmapped: 18399232 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119947264 unmapped: 18391040 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7ce80960
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c0fa5a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119963648 unmapped: 18374656 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.752678871s of 10.058499336s, submitted: 128
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7ce7fe00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112058368 unmapped: 26279936 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9e15000/0x0/0x4ffc00000, data 0x179d045/0x1856000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112058368 unmapped: 26279936 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1202264 data_alloc: 234881024 data_used: 11063296
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9e15000/0x0/0x4ffc00000, data 0x179d045/0x1856000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112058368 unmapped: 26279936 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112058368 unmapped: 26279936 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a59d800 session 0x557d7b4aa000
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef5800 session 0x557d7c055860
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 26263552 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a0c2000 session 0x557d7c5552c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9e16000/0x0/0x4ffc00000, data 0x179d045/0x1856000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1119422 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1119422 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1119422 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111222784 unmapped: 27115520 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111230976 unmapped: 27107328 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1119422 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111230976 unmapped: 27107328 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111230976 unmapped: 27107328 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111230976 unmapped: 27107328 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111230976 unmapped: 27107328 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111230976 unmapped: 27107328 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1119422 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111230976 unmapped: 27107328 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111230976 unmapped: 27107328 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111239168 unmapped: 27099136 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111239168 unmapped: 27099136 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 27090944 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1119422 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 27090944 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 27090944 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 34.560401917s of 34.893112183s, submitted: 105
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111263744 unmapped: 27074560 heap: 138338304 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa782000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,0,0,0,1,4])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c0c52c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7a387680
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a59d800 session 0x557d7cea4780
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef5800 session 0x557d7c9ee1e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70c800 session 0x557d7c9ef860
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111493120 unmapped: 36347904 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111493120 unmapped: 36347904 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253071 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 36339712 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c9ef0e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111501312 unmapped: 36339712 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c9ee960
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111509504 unmapped: 36331520 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f95d3000/0x0/0x4ffc00000, data 0x1fe0fe3/0x2099000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a59d800 session 0x557d7c9ee000
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef5800 session 0x557d7c9ef4a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111902720 unmapped: 35938304 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 111902720 unmapped: 35938304 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262404 data_alloc: 234881024 data_used: 11075584
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 27385856 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 26066944 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121774080 unmapped: 26066944 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121782272 unmapped: 26058752 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f95ae000/0x0/0x4ffc00000, data 0x2005006/0x20be000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121446400 unmapped: 26394624 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1387956 data_alloc: 251658240 data_used: 28327936
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121446400 unmapped: 26394624 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121446400 unmapped: 26394624 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121446400 unmapped: 26394624 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121446400 unmapped: 26394624 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121446400 unmapped: 26394624 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f95ae000/0x0/0x4ffc00000, data 0x2005006/0x20be000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1387956 data_alloc: 251658240 data_used: 28327936
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.526510239s of 17.687055588s, submitted: 31
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128393216 unmapped: 19447808 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8b13000/0x0/0x4ffc00000, data 0x2aa0006/0x2b59000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128761856 unmapped: 19079168 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8ae5000/0x0/0x4ffc00000, data 0x2ace006/0x2b87000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1487708 data_alloc: 251658240 data_used: 29290496
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8ae5000/0x0/0x4ffc00000, data 0x2ace006/0x2b87000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8ae2000/0x0/0x4ffc00000, data 0x2ad1006/0x2b8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1486548 data_alloc: 251658240 data_used: 29282304
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 8518 writes, 34K keys, 8518 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 8518 writes, 2145 syncs, 3.97 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2417 writes, 8796 keys, 2417 commit groups, 1.0 writes per commit group, ingest: 9.65 MB, 0.02 MB/s#012Interval WAL: 2417 writes, 987 syncs, 2.45 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8ae2000/0x0/0x4ffc00000, data 0x2ad1006/0x2b8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1486548 data_alloc: 251658240 data_used: 29282304
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8ae2000/0x0/0x4ffc00000, data 0x2ad1006/0x2b8a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1487156 data_alloc: 251658240 data_used: 29265920
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.798316956s of 20.030382156s, submitted: 95
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128270336 unmapped: 19570688 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1487388 data_alloc: 251658240 data_used: 29265920
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8ae0000/0x0/0x4ffc00000, data 0x2ad3006/0x2b8c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128278528 unmapped: 19562496 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7af13c00 session 0x557d7cb9c3c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c08c5a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 18423808 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 18423808 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 18423808 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f83ac000/0x0/0x4ffc00000, data 0x3206068/0x32c0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 18391040 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1548005 data_alloc: 251658240 data_used: 29265920
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c055680
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a59d800 session 0x557d7b4aa5a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 18374656 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef5800 session 0x557d7c9f34a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.146596909s of 11.267215729s, submitted: 38
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c93e400 session 0x557d7c0c5c20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128778240 unmapped: 19062784 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128778240 unmapped: 19062784 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8386000/0x0/0x4ffc00000, data 0x322a09b/0x32e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130711552 unmapped: 17129472 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133169152 unmapped: 14671872 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1594572 data_alloc: 251658240 data_used: 35155968
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133185536 unmapped: 14655488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133185536 unmapped: 14655488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133185536 unmapped: 14655488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8386000/0x0/0x4ffc00000, data 0x322a09b/0x32e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133193728 unmapped: 14647296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133193728 unmapped: 14647296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1595180 data_alloc: 251658240 data_used: 35184640
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133193728 unmapped: 14647296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8386000/0x0/0x4ffc00000, data 0x322a09b/0x32e6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133193728 unmapped: 14647296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.182846069s of 11.216772079s, submitted: 9
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 133218304 unmapped: 14622720 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8385000/0x0/0x4ffc00000, data 0x322b09b/0x32e7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137166848 unmapped: 10674176 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138264576 unmapped: 9576448 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1676692 data_alloc: 251658240 data_used: 36057088
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138297344 unmapped: 9543680 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138297344 unmapped: 9543680 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b62000/0x0/0x4ffc00000, data 0x3a4d09b/0x3b09000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138297344 unmapped: 9543680 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138297344 unmapped: 9543680 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138297344 unmapped: 9543680 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b62000/0x0/0x4ffc00000, data 0x3a4d09b/0x3b09000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1677148 data_alloc: 251658240 data_used: 36069376
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138297344 unmapped: 9543680 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b3f000/0x0/0x4ffc00000, data 0x3a7109b/0x3b2d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1676300 data_alloc: 251658240 data_used: 36073472
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.274935722s of 12.588764191s, submitted: 93
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b3b000/0x0/0x4ffc00000, data 0x3a7509b/0x3b31000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1676196 data_alloc: 251658240 data_used: 36073472
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b3b000/0x0/0x4ffc00000, data 0x3a7509b/0x3b31000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7ce67860
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1676116 data_alloc: 251658240 data_used: 36073472
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b38000/0x0/0x4ffc00000, data 0x3a7809b/0x3b34000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138289152 unmapped: 9551872 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.778336525s of 10.798649788s, submitted: 5
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138289152 unmapped: 9551872 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138289152 unmapped: 9551872 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138305536 unmapped: 9535488 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b34000/0x0/0x4ffc00000, data 0x3a7909b/0x3b35000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1676156 data_alloc: 251658240 data_used: 36073472
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138313728 unmapped: 9527296 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138321920 unmapped: 9519104 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b34000/0x0/0x4ffc00000, data 0x3a7909b/0x3b35000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138321920 unmapped: 9519104 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1676156 data_alloc: 251658240 data_used: 36073472
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138321920 unmapped: 9519104 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b37000/0x0/0x4ffc00000, data 0x3a7909b/0x3b35000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138321920 unmapped: 9519104 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.394987106s of 11.418004036s, submitted: 6
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138346496 unmapped: 9494528 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138403840 unmapped: 9437184 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138625024 unmapped: 9216000 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1675828 data_alloc: 251658240 data_used: 36073472
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b36000/0x0/0x4ffc00000, data 0x3a7a09b/0x3b36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [0,0,1])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1675996 data_alloc: 251658240 data_used: 36073472
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b36000/0x0/0x4ffc00000, data 0x3a7a09b/0x3b36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138739712 unmapped: 9101312 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138756096 unmapped: 9084928 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.908274651s of 12.600452423s, submitted: 229
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1676012 data_alloc: 251658240 data_used: 36073472
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b36000/0x0/0x4ffc00000, data 0x3a7a09b/0x3b36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b36000/0x0/0x4ffc00000, data 0x3a7a09b/0x3b36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138756096 unmapped: 9084928 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138756096 unmapped: 9084928 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138756096 unmapped: 9084928 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b36000/0x0/0x4ffc00000, data 0x3a7a09b/0x3b36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138756096 unmapped: 9084928 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138756096 unmapped: 9084928 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1676012 data_alloc: 251658240 data_used: 36073472
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b36000/0x0/0x4ffc00000, data 0x3a7a09b/0x3b36000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138756096 unmapped: 9084928 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b35000/0x0/0x4ffc00000, data 0x3a7b09b/0x3b37000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138756096 unmapped: 9084928 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138764288 unmapped: 9076736 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c0c4780
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7ceb23c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7b35000/0x0/0x4ffc00000, data 0x3a7b09b/0x3b37000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138764288 unmapped: 9076736 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c93e400 session 0x557d7ce7e780
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132513792 unmapped: 15327232 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1500695 data_alloc: 251658240 data_used: 28966912
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.117815018s of 10.239953041s, submitted: 35
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f879a000/0x0/0x4ffc00000, data 0x2ad8006/0x2b91000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132513792 unmapped: 15327232 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132513792 unmapped: 15327232 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132513792 unmapped: 15327232 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132513792 unmapped: 15327232 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132513792 unmapped: 15327232 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a70cc00 session 0x557d7ceb32c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7c08d4a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1499887 data_alloc: 251658240 data_used: 28966912
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c117860
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa814000/0x0/0x4ffc00000, data 0xd9f006/0xe58000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147879 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147879 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147879 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1147879 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 118743040 unmapped: 29097984 heap: 147841024 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 nova_compute[231311]: 2025-11-23 21:28:09.964 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7a5d8780
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7ae2b0e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c93e400 session 0x557d7b2c1a40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c9eef00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.234661102s of 27.350326538s, submitted: 42
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7a5d85a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7c443860
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7c4434a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7cef5800 session 0x557d7c442f00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c443e00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 41517056 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 41517056 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 41517056 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280731 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120045568 unmapped: 41517056 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c9eeb40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120111104 unmapped: 41451520 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120135680 unmapped: 41426944 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f97b0000/0x0/0x4ffc00000, data 0x1e01078/0x1ebc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f97b0000/0x0/0x4ffc00000, data 0x1e01078/0x1ebc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120397824 unmapped: 41164800 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f97b0000/0x0/0x4ffc00000, data 0x1e01078/0x1ebc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 35766272 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1399096 data_alloc: 251658240 data_used: 27426816
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 35766272 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7b4ab4a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7cea4780
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01000 session 0x557d7c9ee000
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f97b0000/0x0/0x4ffc00000, data 0x1e01078/0x1ebc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162712 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa627000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119357440 unmapped: 42205184 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162712 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01000 session 0x557d7ae2b0e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7c9eb4a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c9eab40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7c9eba40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.146602631s of 18.466539383s, submitted: 106
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119308288 unmapped: 42254336 heap: 161562624 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7c9ea000
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7a5d90e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d799aa800 session 0x557d7a5d85a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c0c5680
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7cea4000
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9c57000/0x0/0x4ffc00000, data 0x195bff3/0x1a15000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121626624 unmapped: 44138496 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121626624 unmapped: 44138496 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01000 session 0x557d7c9ea1e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121626624 unmapped: 44138496 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01000 session 0x557d7c0fa1e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121536512 unmapped: 44228608 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1259248 data_alloc: 234881024 data_used: 10432512
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7cea4d20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7d0205a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121487360 unmapped: 44277760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 44474368 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9c55000/0x0/0x4ffc00000, data 0x195c026/0x1a17000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 124338176 unmapped: 41426944 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7c0fbe00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01400 session 0x557d7c5552c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 124321792 unmapped: 41443328 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119963648 unmapped: 45801472 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1174413 data_alloc: 234881024 data_used: 10432512
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c0fa1e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 119971840 unmapped: 45793280 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: mgrc ms_handle_reset ms_handle_reset con 0x557d7a828000
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/844402651
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/844402651,v1:192.168.122.100:6801/844402651]
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: mgrc handle_mgr_configure stats_period=5
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1173148 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1173148 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1173148 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1173148 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa838000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1173148 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120029184 unmapped: 45735936 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 37.237335205s of 37.444107056s, submitted: 49
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7c1172c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7c7b4b40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120184832 unmapped: 45580288 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9ede000/0x0/0x4ffc00000, data 0x16d5045/0x178e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120184832 unmapped: 45580288 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1242171 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120184832 unmapped: 45580288 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120184832 unmapped: 45580288 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120184832 unmapped: 45580288 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120184832 unmapped: 45580288 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9ede000/0x0/0x4ffc00000, data 0x16d5045/0x178e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120184832 unmapped: 45580288 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1242171 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01000 session 0x557d7c7b41e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120487936 unmapped: 45277184 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120504320 unmapped: 45260800 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9eba000/0x0/0x4ffc00000, data 0x16f9045/0x17b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1312292 data_alloc: 234881024 data_used: 20238336
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9eba000/0x0/0x4ffc00000, data 0x16f9045/0x17b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9eba000/0x0/0x4ffc00000, data 0x16f9045/0x17b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9eba000/0x0/0x4ffc00000, data 0x16f9045/0x17b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9eba000/0x0/0x4ffc00000, data 0x16f9045/0x17b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1312292 data_alloc: 234881024 data_used: 20238336
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 123535360 unmapped: 42229760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.107950211s of 19.235004425s, submitted: 37
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 38871040 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126894080 unmapped: 38871040 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 38846464 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9a73000/0x0/0x4ffc00000, data 0x1b40045/0x1bf9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357756 data_alloc: 234881024 data_used: 20475904
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 38846464 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 38846464 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126918656 unmapped: 38846464 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9a73000/0x0/0x4ffc00000, data 0x1b40045/0x1bf9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 38813696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126951424 unmapped: 38813696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357756 data_alloc: 234881024 data_used: 20475904
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9a73000/0x0/0x4ffc00000, data 0x1b40045/0x1bf9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 38780928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 38780928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 38780928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 38780928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9a73000/0x0/0x4ffc00000, data 0x1b40045/0x1bf9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 38780928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357756 data_alloc: 234881024 data_used: 20475904
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9a73000/0x0/0x4ffc00000, data 0x1b40045/0x1bf9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126984192 unmapped: 38780928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 38772736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 126992384 unmapped: 38772736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.788988113s of 15.941099167s, submitted: 48
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01800 session 0x557d7c0fa5a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01c00 session 0x557d7cf02000
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7ceb2f00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9a73000/0x0/0x4ffc00000, data 0x1b40045/0x1bf9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1185834 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa49e000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1185834 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa49e000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa49e000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1185834 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120455168 unmapped: 45309952 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa49e000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120463360 unmapped: 45301760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120463360 unmapped: 45301760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120463360 unmapped: 45301760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1185834 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120463360 unmapped: 45301760 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa49e000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 45293568 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 45293568 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 45293568 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7b44bc20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7c9eb680
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120471552 unmapped: 45293568 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01000 session 0x557d7c9efe00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1185834 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c7b5a40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.021965027s of 22.198305130s, submitted: 38
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7c0fbc20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7a087860
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01c00 session 0x557d7cea5e00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00c00 session 0x557d7cb9c3c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7ceb32c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 44818432 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120954880 unmapped: 44810240 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9fd4000/0x0/0x4ffc00000, data 0x15de055/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 44802048 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 44802048 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 44802048 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1258869 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7ce7e780
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 44802048 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7c1161e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01c00 session 0x557d7ce810e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9fd4000/0x0/0x4ffc00000, data 0x15de055/0x1698000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00800 session 0x557d7ce7f4a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 44638208 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 44613632 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9b9f000/0x0/0x4ffc00000, data 0x1602065/0x16bd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 44736512 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 44916736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1315524 data_alloc: 234881024 data_used: 17928192
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 44916736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 44916736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 44916736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9b9f000/0x0/0x4ffc00000, data 0x1602065/0x16bd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 44916736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 44916736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1315524 data_alloc: 234881024 data_used: 17928192
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 44916736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f9b9f000/0x0/0x4ffc00000, data 0x1602065/0x16bd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 44916736 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7a387860
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7a387c20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01c00 session 0x557d7a3870e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00400 session 0x557d7ce81860
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.361158371s of 17.506059647s, submitted: 48
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00000 session 0x557d7ce812c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 44023808 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 121741312 unmapped: 44023808 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f90f8000/0x0/0x4ffc00000, data 0x20a9065/0x2164000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [0,3,3,1])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130891776 unmapped: 34873344 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1453440 data_alloc: 234881024 data_used: 19214336
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129777664 unmapped: 35987456 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129818624 unmapped: 35946496 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8c51000/0x0/0x4ffc00000, data 0x254f065/0x260a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129818624 unmapped: 35946496 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129818624 unmapped: 35946496 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129818624 unmapped: 35946496 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1453290 data_alloc: 234881024 data_used: 19423232
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7a04b680
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7c9eb680
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129818624 unmapped: 35946496 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00000 session 0x557d7c9eb4a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00400 session 0x557d7c9eab40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129835008 unmapped: 35930112 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8c52000/0x0/0x4ffc00000, data 0x254f065/0x260a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129835008 unmapped: 35930112 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131047424 unmapped: 34717696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131047424 unmapped: 34717696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1482239 data_alloc: 234881024 data_used: 23588864
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131047424 unmapped: 34717696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131047424 unmapped: 34717696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131047424 unmapped: 34717696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8c52000/0x0/0x4ffc00000, data 0x254f065/0x260a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8c52000/0x0/0x4ffc00000, data 0x254f065/0x260a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131047424 unmapped: 34717696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131047424 unmapped: 34717696 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1482239 data_alloc: 234881024 data_used: 23588864
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8c52000/0x0/0x4ffc00000, data 0x254f065/0x260a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131080192 unmapped: 34684928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131080192 unmapped: 34684928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131080192 unmapped: 34684928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131080192 unmapped: 34684928 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 21.508337021s of 21.807014465s, submitted: 132
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8427000/0x0/0x4ffc00000, data 0x2d72065/0x2e2d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134545408 unmapped: 31219712 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1544079 data_alloc: 234881024 data_used: 23617536
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138452992 unmapped: 27312128 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7f42000/0x0/0x4ffc00000, data 0x325f065/0x331a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138682368 unmapped: 27082752 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138240000 unmapped: 27525120 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138240000 unmapped: 27525120 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7ea1000/0x0/0x4ffc00000, data 0x32f7065/0x33b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138272768 unmapped: 27492352 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1601253 data_alloc: 234881024 data_used: 25538560
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138272768 unmapped: 27492352 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138272768 unmapped: 27492352 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137764864 unmapped: 28000256 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7e89000/0x0/0x4ffc00000, data 0x3318065/0x33d3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137773056 unmapped: 27992064 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7e89000/0x0/0x4ffc00000, data 0x3318065/0x33d3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137781248 unmapped: 27983872 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1593645 data_alloc: 234881024 data_used: 25538560
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137789440 unmapped: 27975680 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.961110115s of 12.277703285s, submitted: 147
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137789440 unmapped: 27975680 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7e89000/0x0/0x4ffc00000, data 0x3318065/0x33d3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137789440 unmapped: 27975680 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7e89000/0x0/0x4ffc00000, data 0x3318065/0x33d3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137789440 unmapped: 27975680 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f7e89000/0x0/0x4ffc00000, data 0x3318065/0x33d3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137789440 unmapped: 27975680 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1593645 data_alloc: 234881024 data_used: 25538560
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137814016 unmapped: 27951104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca01c00 session 0x557d7a04ad20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c4ae800 session 0x557d7ceb25a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 137814016 unmapped: 27951104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7b2aab40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132710400 unmapped: 33054720 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132710400 unmapped: 33054720 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f91b9000/0x0/0x4ffc00000, data 0x1fe8065/0x20a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132710400 unmapped: 33054720 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1409603 data_alloc: 234881024 data_used: 19427328
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00800 session 0x557d7ceb2780
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7cea5c20
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 132718592 unmapped: 33046528 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.836650848s of 10.001040459s, submitted: 60
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7b2ab860
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212612 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212612 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212612 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212612 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128598016 unmapped: 37167104 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128606208 unmapped: 37158912 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128606208 unmapped: 37158912 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128606208 unmapped: 37158912 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128606208 unmapped: 37158912 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1212612 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128606208 unmapped: 37158912 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128606208 unmapped: 37158912 heap: 165765120 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7a5d8000
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7b4aa3c0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7c7b4960
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c4ae800 session 0x557d7c054b40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [0,0,0,0,0,1])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.840406418s of 25.882516861s, submitted: 12
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00800 session 0x557d7c0541e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00800 session 0x557d7b2aab40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7b2ab860
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7b2ab0e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7ceb2780
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128778240 unmapped: 40665088 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96ce000/0x0/0x4ffc00000, data 0x1ad4055/0x1b8e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128778240 unmapped: 40665088 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128778240 unmapped: 40665088 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1315173 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128786432 unmapped: 40656896 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128786432 unmapped: 40656896 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128786432 unmapped: 40656896 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c4ae800 session 0x557d7ceb21e0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 128786432 unmapped: 40656896 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7ceb3a40
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96ce000/0x0/0x4ffc00000, data 0x1ad4055/0x1b8e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96ce000/0x0/0x4ffc00000, data 0x1ad4055/0x1b8e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ae40000 session 0x557d7c9eb680
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7c147400 session 0x557d7c9eb4a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129114112 unmapped: 40329216 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1320008 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96a9000/0x0/0x4ffc00000, data 0x1af8065/0x1bb3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129114112 unmapped: 40329216 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 131227648 unmapped: 38215680 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134152192 unmapped: 35291136 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134160384 unmapped: 35282944 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96a9000/0x0/0x4ffc00000, data 0x1af8065/0x1bb3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134160384 unmapped: 35282944 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1414684 data_alloc: 234881024 data_used: 24420352
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134160384 unmapped: 35282944 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96a9000/0x0/0x4ffc00000, data 0x1af8065/0x1bb3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134193152 unmapped: 35250176 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96a9000/0x0/0x4ffc00000, data 0x1af8065/0x1bb3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134225920 unmapped: 35217408 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96a9000/0x0/0x4ffc00000, data 0x1af8065/0x1bb3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134225920 unmapped: 35217408 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134225920 unmapped: 35217408 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1414684 data_alloc: 234881024 data_used: 24420352
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134225920 unmapped: 35217408 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 134234112 unmapped: 35209216 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.733076096s of 19.895635605s, submitted: 44
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f96a9000/0x0/0x4ffc00000, data 0x1af8065/0x1bb3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 138870784 unmapped: 30572544 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 30326784 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 30326784 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1481696 data_alloc: 234881024 data_used: 25444352
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139116544 unmapped: 30326784 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8fe6000/0x0/0x4ffc00000, data 0x21bb065/0x2276000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139132928 unmapped: 30310400 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8fe6000/0x0/0x4ffc00000, data 0x21bb065/0x2276000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139132928 unmapped: 30310400 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139141120 unmapped: 30302208 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139157504 unmapped: 30285824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1480672 data_alloc: 234881024 data_used: 25448448
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4f8fe4000/0x0/0x4ffc00000, data 0x21bd065/0x2278000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139157504 unmapped: 30285824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139157504 unmapped: 30285824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139157504 unmapped: 30285824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 139157504 unmapped: 30285824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.296658516s of 12.521332741s, submitted: 87
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00800 session 0x557d7c08d4a0
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7ca00000 session 0x557d7c554f00
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130342912 unmapped: 39100416 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1235822 data_alloc: 234881024 data_used: 10539008
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a05ec00 session 0x557d7c442960
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130400256 unmapped: 39043072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130400256 unmapped: 39043072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130400256 unmapped: 39043072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130400256 unmapped: 39043072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130400256 unmapped: 39043072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130400256 unmapped: 39043072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130400256 unmapped: 39043072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130400256 unmapped: 39043072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130408448 unmapped: 39034880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130408448 unmapped: 39034880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130408448 unmapped: 39034880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130408448 unmapped: 39034880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:09 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130408448 unmapped: 39034880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130408448 unmapped: 39034880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130408448 unmapped: 39034880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130416640 unmapped: 39026688 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130416640 unmapped: 39026688 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130416640 unmapped: 39026688 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130416640 unmapped: 39026688 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130416640 unmapped: 39026688 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130416640 unmapped: 39026688 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130416640 unmapped: 39026688 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130416640 unmapped: 39026688 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130424832 unmapped: 39018496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130424832 unmapped: 39018496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130424832 unmapped: 39018496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130424832 unmapped: 39018496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130424832 unmapped: 39018496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130424832 unmapped: 39018496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130424832 unmapped: 39018496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130424832 unmapped: 39018496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130433024 unmapped: 39010304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130433024 unmapped: 39010304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130433024 unmapped: 39010304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130433024 unmapped: 39010304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130433024 unmapped: 39010304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130433024 unmapped: 39010304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130433024 unmapped: 39010304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130433024 unmapped: 39010304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130441216 unmapped: 39002112 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130441216 unmapped: 39002112 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130449408 unmapped: 38993920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130449408 unmapped: 38993920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130449408 unmapped: 38993920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130449408 unmapped: 38993920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130449408 unmapped: 38993920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130449408 unmapped: 38993920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130457600 unmapped: 38985728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130457600 unmapped: 38985728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130457600 unmapped: 38985728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130457600 unmapped: 38985728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130465792 unmapped: 38977536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130465792 unmapped: 38977536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130465792 unmapped: 38977536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130465792 unmapped: 38977536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130465792 unmapped: 38977536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130465792 unmapped: 38977536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130473984 unmapped: 38969344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130473984 unmapped: 38969344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130473984 unmapped: 38969344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130473984 unmapped: 38969344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130473984 unmapped: 38969344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130473984 unmapped: 38969344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130482176 unmapped: 38961152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130482176 unmapped: 38961152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130482176 unmapped: 38961152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130482176 unmapped: 38961152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130482176 unmapped: 38961152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130490368 unmapped: 38952960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130490368 unmapped: 38952960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130490368 unmapped: 38952960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130490368 unmapped: 38952960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130490368 unmapped: 38952960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130490368 unmapped: 38952960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'config diff' '{prefix=config diff}'
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130531328 unmapped: 38912000 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'config show' '{prefix=config show}'
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'counter dump' '{prefix=counter dump}'
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'counter schema' '{prefix=counter schema}'
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130572288 unmapped: 38871040 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130572288 unmapped: 38871040 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'log dump' '{prefix=log dump}'
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130596864 unmapped: 38846464 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'perf dump' '{prefix=perf dump}'
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'perf schema' '{prefix=perf schema}'
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130359296 unmapped: 39084032 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130367488 unmapped: 39075840 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130367488 unmapped: 39075840 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130367488 unmapped: 39075840 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130367488 unmapped: 39075840 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130367488 unmapped: 39075840 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130367488 unmapped: 39075840 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130367488 unmapped: 39075840 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130367488 unmapped: 39075840 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 39616512 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 39616512 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 39616512 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 39616512 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 39616512 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 39616512 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 39616512 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 39616512 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129835008 unmapped: 39608320 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129835008 unmapped: 39608320 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129835008 unmapped: 39608320 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129835008 unmapped: 39608320 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129835008 unmapped: 39608320 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129843200 unmapped: 39600128 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129843200 unmapped: 39600128 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129843200 unmapped: 39600128 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129843200 unmapped: 39600128 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129843200 unmapped: 39600128 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129843200 unmapped: 39600128 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129843200 unmapped: 39600128 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129843200 unmapped: 39600128 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129843200 unmapped: 39600128 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129843200 unmapped: 39600128 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129843200 unmapped: 39600128 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129851392 unmapped: 39591936 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129851392 unmapped: 39591936 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129859584 unmapped: 39583744 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 234881024 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129859584 unmapped: 39583744 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129859584 unmapped: 39583744 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129859584 unmapped: 39583744 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129859584 unmapped: 39583744 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129859584 unmapped: 39583744 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129867776 unmapped: 39575552 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129867776 unmapped: 39575552 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129875968 unmapped: 39567360 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129875968 unmapped: 39567360 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129875968 unmapped: 39567360 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129875968 unmapped: 39567360 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129875968 unmapped: 39567360 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129875968 unmapped: 39567360 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129875968 unmapped: 39567360 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129875968 unmapped: 39567360 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129875968 unmapped: 39567360 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129875968 unmapped: 39567360 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129875968 unmapped: 39567360 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129875968 unmapped: 39567360 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129875968 unmapped: 39567360 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129875968 unmapped: 39567360 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129875968 unmapped: 39567360 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129884160 unmapped: 39559168 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129884160 unmapped: 39559168 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129884160 unmapped: 39559168 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129884160 unmapped: 39559168 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129884160 unmapped: 39559168 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129892352 unmapped: 39550976 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129892352 unmapped: 39550976 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129900544 unmapped: 39542784 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129900544 unmapped: 39542784 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129900544 unmapped: 39542784 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129900544 unmapped: 39542784 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129900544 unmapped: 39542784 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129900544 unmapped: 39542784 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129900544 unmapped: 39542784 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129900544 unmapped: 39542784 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129900544 unmapped: 39542784 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129908736 unmapped: 39534592 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129908736 unmapped: 39534592 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129908736 unmapped: 39534592 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129908736 unmapped: 39534592 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129908736 unmapped: 39534592 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129908736 unmapped: 39534592 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129908736 unmapped: 39534592 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129916928 unmapped: 39526400 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129916928 unmapped: 39526400 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129916928 unmapped: 39526400 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129916928 unmapped: 39526400 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129916928 unmapped: 39526400 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129916928 unmapped: 39526400 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129916928 unmapped: 39526400 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129916928 unmapped: 39526400 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 39501824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 39501824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 39501824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 39501824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 39501824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 39501824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 39501824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 39501824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 39501824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 39501824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 39501824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 39501824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 39501824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 39501824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 39501824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129941504 unmapped: 39501824 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 39493632 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 39493632 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 39493632 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 39493632 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 39493632 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 39493632 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 39493632 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 39493632 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 39493632 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 39493632 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 39485440 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 39485440 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 39485440 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 39485440 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 39485440 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 39485440 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 39477248 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 39477248 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129982464 unmapped: 39460864 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129982464 unmapped: 39460864 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129982464 unmapped: 39460864 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129982464 unmapped: 39460864 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129982464 unmapped: 39460864 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129982464 unmapped: 39460864 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129982464 unmapped: 39460864 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129982464 unmapped: 39460864 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129982464 unmapped: 39460864 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129982464 unmapped: 39460864 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129982464 unmapped: 39460864 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129982464 unmapped: 39460864 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129982464 unmapped: 39460864 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129990656 unmapped: 39452672 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129990656 unmapped: 39452672 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129990656 unmapped: 39452672 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130007040 unmapped: 39436288 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130007040 unmapped: 39436288 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130007040 unmapped: 39436288 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130007040 unmapped: 39436288 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130007040 unmapped: 39436288 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130007040 unmapped: 39436288 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 39477248 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 39477248 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 39477248 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 39477248 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129966080 unmapped: 39477248 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 10K writes, 42K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s#012Cumulative WAL: 10K writes, 3067 syncs, 3.50 writes per sync, written: 0.04 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2228 writes, 8044 keys, 2228 commit groups, 1.0 writes per commit group, ingest: 8.01 MB, 0.01 MB/s#012Interval WAL: 2228 writes, 922 syncs, 2.42 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129974272 unmapped: 39469056 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129982464 unmapped: 39460864 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129982464 unmapped: 39460864 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129990656 unmapped: 39452672 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129990656 unmapped: 39452672 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129990656 unmapped: 39452672 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129990656 unmapped: 39452672 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129990656 unmapped: 39452672 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129990656 unmapped: 39452672 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129990656 unmapped: 39452672 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129990656 unmapped: 39452672 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129998848 unmapped: 39444480 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130015232 unmapped: 39428096 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130015232 unmapped: 39428096 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130015232 unmapped: 39428096 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130015232 unmapped: 39428096 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130015232 unmapped: 39428096 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130015232 unmapped: 39428096 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130015232 unmapped: 39428096 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130015232 unmapped: 39428096 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130015232 unmapped: 39428096 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130015232 unmapped: 39428096 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130015232 unmapped: 39428096 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130015232 unmapped: 39428096 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130015232 unmapped: 39428096 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130015232 unmapped: 39428096 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130015232 unmapped: 39428096 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130015232 unmapped: 39428096 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130023424 unmapped: 39419904 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130023424 unmapped: 39419904 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130023424 unmapped: 39419904 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130023424 unmapped: 39419904 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130023424 unmapped: 39419904 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130023424 unmapped: 39419904 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130023424 unmapped: 39419904 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130023424 unmapped: 39419904 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130023424 unmapped: 39419904 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130023424 unmapped: 39419904 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa428000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130039808 unmapped: 39403520 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130039808 unmapped: 39403520 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228469 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130039808 unmapped: 39403520 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 317.148895264s of 317.299560547s, submitted: 41
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 130039808 unmapped: 39403520 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129105920 unmapped: 40337408 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129286144 unmapped: 40157184 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129351680 unmapped: 40091648 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129351680 unmapped: 40091648 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129351680 unmapped: 40091648 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129351680 unmapped: 40091648 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129351680 unmapped: 40091648 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129351680 unmapped: 40091648 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129351680 unmapped: 40091648 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129351680 unmapped: 40091648 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129351680 unmapped: 40091648 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129351680 unmapped: 40091648 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129359872 unmapped: 40083456 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129359872 unmapped: 40083456 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129359872 unmapped: 40083456 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129359872 unmapped: 40083456 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129359872 unmapped: 40083456 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129359872 unmapped: 40083456 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 40075264 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 40075264 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 40075264 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 40075264 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 40075264 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 40075264 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 40075264 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129368064 unmapped: 40075264 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 40067072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129376256 unmapped: 40067072 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129384448 unmapped: 40058880 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129392640 unmapped: 40050688 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129400832 unmapped: 40042496 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 40034304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 40034304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 40034304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 40034304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 40034304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 40034304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 40034304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 40034304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 40034304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 40034304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 40034304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 40034304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 40034304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 40034304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 40034304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 40034304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129409024 unmapped: 40034304 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 40026112 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 40026112 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 40026112 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 40026112 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 40026112 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 40026112 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 40026112 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 40026112 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 40026112 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 40017920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 40017920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 40017920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 40017920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 40017920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 40017920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 40017920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 40017920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 40017920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 40017920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 40017920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 40017920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 40017920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129425408 unmapped: 40017920 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 40009728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 40009728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 40009728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 40009728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 40009728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 40009728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 40009728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 40009728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 40009728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 40009728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 40009728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 40009728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 40009728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 40009728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 40009728 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 40001536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 40001536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 40001536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 40001536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 40001536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 40001536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 40001536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 40001536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 40001536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 40001536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 40001536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 40001536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 40001536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 40001536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 40001536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129441792 unmapped: 40001536 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 39993344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 39993344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 39993344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 39993344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 39993344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 39993344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 39993344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 39993344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 39993344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 39993344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 39993344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 39993344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 39993344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 39993344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 39993344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129449984 unmapped: 39993344 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 39985152 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 39976960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 39976960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 39976960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 39976960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 39976960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 39976960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 39976960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129466368 unmapped: 39976960 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 39968768 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 39968768 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 39968768 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 39968768 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 39968768 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 39968768 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 39968768 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 39968768 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 39968768 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 39968768 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 39968768 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 39968768 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 39968768 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129474560 unmapped: 39968768 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 39960576 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 39960576 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 39960576 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 39960576 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 39960576 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 39960576 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 39960576 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 39960576 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 39960576 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 39960576 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 39960576 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 39960576 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 39960576 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 39960576 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 39960576 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129482752 unmapped: 39960576 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 39952384 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 39952384 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 39952384 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 39952384 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 39952384 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 39952384 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 39952384 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 39952384 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 39952384 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 39952384 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 39952384 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 39952384 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 39952384 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 39952384 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 39952384 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 39952384 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 39944192 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 39944192 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 39944192 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 39944192 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 39944192 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 39944192 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 39944192 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129499136 unmapped: 39944192 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 39936000 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 39936000 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 39936000 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 39936000 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 39936000 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 39936000 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 39936000 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129507328 unmapped: 39936000 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 39927808 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 39927808 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 39927808 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 39927808 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 39927808 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 39927808 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 39927808 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 39927808 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 39927808 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129515520 unmapped: 39927808 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 39919616 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 39919616 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 39919616 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 39919616 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 39919616 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 39919616 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 39919616 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 39919616 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 39919616 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 39919616 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 39919616 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 39919616 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 39919616 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129523712 unmapped: 39919616 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 ms_handle_reset con 0x557d7a59d800 session 0x557d7c0fad20
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129531904 unmapped: 39911424 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 39903232 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 39903232 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1228177 data_alloc: 218103808 data_used: 10428416
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129540096 unmapped: 39903232 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'config diff' '{prefix=config diff}'
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'config show' '{prefix=config show}'
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129875968 unmapped: 39567360 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'counter dump' '{prefix=counter dump}'
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'counter schema' '{prefix=counter schema}'
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fa429000/0x0/0x4ffc00000, data 0xd7afe3/0xe33000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [0,1] op hist [])
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129548288 unmapped: 39895040 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: prioritycache tune_memory target: 4294967296 mapped: 129687552 unmapped: 39755776 heap: 169443328 old mem: 2845415832 new mem: 2845415832
Nov 23 16:28:10 np0005532763 ceph-osd[78269]: do_command 'log dump' '{prefix=log dump}'
Nov 23 16:28:10 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:28:10 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Nov 23 16:28:10 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3672847451' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Nov 23 16:28:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:10 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:10 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:10 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Nov 23 16:28:10 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1372842905' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Nov 23 16:28:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:28:11.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:11 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Nov 23 16:28:11 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/313308784' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Nov 23 16:28:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:11 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:11 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:11 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Nov 23 16:28:11 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1549457199' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Nov 23 16:28:11 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:11 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:11 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:28:11.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:28:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:28:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:28:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:28:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:28:11 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:28:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:28:12 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:28:12 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Nov 23 16:28:12 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2730715203' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Nov 23 16:28:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:12 np0005532763 nova_compute[231311]: 2025-11-23 21:28:12.265 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:28:12 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:12 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:12 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Nov 23 16:28:12 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3415009304' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Nov 23 16:28:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:28:13.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:13 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Nov 23 16:28:13 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1261971723' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Nov 23 16:28:13 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:13 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:13 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Nov 23 16:28:13 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/219328293' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Nov 23 16:28:13 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:13 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:13 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:28:13.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:13 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Nov 23 16:28:13 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1393218024' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Nov 23 16:28:14 np0005532763 podman[258583]: 2025-11-23 21:28:14.210202251 +0000 UTC m=+0.087075410 container health_status 096004674e98a3ffcb13688a7240e0e8e61da3ba99d2c00b8a89fa4dc8fd0d45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 23 16:28:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Nov 23 16:28:14 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4077139984' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Nov 23 16:28:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Nov 23 16:28:14 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2956273160' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Nov 23 16:28:14 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:14 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Nov 23 16:28:14 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2697393913' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Nov 23 16:28:14 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Nov 23 16:28:14 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/869848352' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Nov 23 16:28:14 np0005532763 nova_compute[231311]: 2025-11-23 21:28:14.967 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:28:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:28:15.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:28:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Nov 23 16:28:15 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2596138598' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Nov 23 16:28:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Nov 23 16:28:15 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2808481883' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Nov 23 16:28:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:15 np0005532763 systemd[1]: Starting Hostname Service...
Nov 23 16:28:15 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:15 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:15 np0005532763 systemd[1]: Started Hostname Service.
Nov 23 16:28:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Nov 23 16:28:15 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2971409743' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Nov 23 16:28:15 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Nov 23 16:28:15 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2531662656' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Nov 23 16:28:15 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:15 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:15 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:28:15.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:16 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Nov 23 16:28:16 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3758994834' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Nov 23 16:28:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:16 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Nov 23 16:28:16 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2172058909' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Nov 23 16:28:16 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:16 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:16 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Nov 23 16:28:16 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3086242074' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Nov 23 16:28:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:28:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Nov 23 16:28:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:28:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Nov 23 16:28:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:28:16 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Nov 23 16:28:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-nfs-cephfs-1-0-compute-2-dqbktw[235967]: 23/11/2025 21:28:17 : epoch 692377b8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Nov 23 16:28:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:28:17.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:17 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 16:28:17 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 16:28:17 np0005532763 nova_compute[231311]: 2025-11-23 21:28:17.270 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:28:17 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:17 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:17 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 16:28:17 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 16:28:17 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:17 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:17 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:28:17.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:18 np0005532763 nova_compute[231311]: 2025-11-23 21:28:18.091 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:28:18 np0005532763 nova_compute[231311]: 2025-11-23 21:28:18.092 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:28:18 np0005532763 nova_compute[231311]: 2025-11-23 21:28:18.092 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:28:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:18 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Nov 23 16:28:18 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2962810161' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Nov 23 16:28:18 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:18 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:18 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Nov 23 16:28:18 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1555797194' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Nov 23 16:28:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:28:19.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Nov 23 16:28:19 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/272507563' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Nov 23 16:28:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:19 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:19 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:19 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Nov 23 16:28:19 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/126716860' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Nov 23 16:28:19 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:19 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.000000000s ======
Nov 23 16:28:19 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.102 - anonymous [23/Nov/2025:21:28:19.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Nov 23 16:28:19 np0005532763 nova_compute[231311]: 2025-11-23 21:28:19.969 231315 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 23 16:28:20 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Nov 23 16:28:20 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Nov 23 16:28:20 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Nov 23 16:28:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:20 np0005532763 nova_compute[231311]: 2025-11-23 21:28:20.383 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:28:20 np0005532763 nova_compute[231311]: 2025-11-23 21:28:20.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 23 16:28:20 np0005532763 nova_compute[231311]: 2025-11-23 21:28:20.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 23 16:28:20 np0005532763 nova_compute[231311]: 2025-11-23 21:28:20.397 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 23 16:28:20 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:20 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:20 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Nov 23 16:28:20 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1345965444' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Nov 23 16:28:21 np0005532763 radosgw[84112]: ====== starting new request req=0x7ff1de8425d0 =====
Nov 23 16:28:21 np0005532763 radosgw[84112]: ====== req done req=0x7ff1de8425d0 op status=0 http_status=200 latency=0.001000028s ======
Nov 23 16:28:21 np0005532763 radosgw[84112]: beast: 0x7ff1de8425d0: 192.168.122.100 - anonymous [23/Nov/2025:21:28:21.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Nov 23 16:28:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-nfs-cephfs-compute-2-cpybdt[86087]: Sun Nov 23 21:28:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:21 np0005532763 nova_compute[231311]: 2025-11-23 21:28:21.382 231315 DEBUG oslo_service.periodic_task [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 23 16:28:21 np0005532763 nova_compute[231311]: 2025-11-23 21:28:21.383 231315 DEBUG nova.compute.manager [None req-18a455d4-6808-496a-bfe6-1c9c25eae50a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 23 16:28:21 np0005532763 ceph-03808be8-ae4a-5548-82e6-4a294f1bc627-keepalived-rgw-default-compute-2-zjypck[86974]: Sun Nov 23 21:28:21 2025: (VI_0) received an invalid passwd!
Nov 23 16:28:21 np0005532763 ceph-mon[75752]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Nov 23 16:28:21 np0005532763 ceph-mon[75752]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/546543363' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
